/tech is a new Nexus series exploring the technology that powers League of Legends. If you like this story, consider checking out the Riot Games Engineering blog for even more complex deep-dives.
One of the biggest challenges of developing a game as big as League of Legends, shipping as frequently as it does, is making major changes to the tools we use to build the game without breaking anything. Hello, I’m John “Riot Jengles” Englund, a technical artist who works on our art tools, here to share how we delivered a brand-new visual effects editing tool in time to ship Kayn, our newest jungle assassin, who can walk through walls and possess other champions. (But not my champions! I play immobile ADCs and will be adopting a strict “always ban Kayn” policy in champ select.)
A TEAM DEDICATED TO ITS NAME
On our team, the mission is to create tools and tech that enable Rioters to deliver the best for players (by “best,” we mean whatever each team defines as best in the context of their work). We do that by providing software that aims to make the iterative loop of content creation as efficient as possible. That’s why our team is aptly named Content Efficiency! We are responsible for the back-end platform that hosts all of League’s game data, as well as the tools designers and artists use to edit properties for champions, skins, spells, and more.
Late last year, we decided it was time to deliver a brand-new tool for editing VFX. Why? League has over 110,000 different VFX. They’re a huge part of what brings the action in the game to life, and players often tell us how VFX make or break their enjoyment of champions, skins, items, map effects, and more.
If you want to know more about the art of VFX, check out this article by RiotPhoenix, a VFX artist on Kayn who helped us develop and test the new tool!
You can see how high-quality VFX are integral to the game, but our workflow for authoring VFX was starting to show its age. There was no “undo” button, which made it hard to experiment with changes. VFX artists also had to keep the game running, save their effect, and wait for it to reload every time they wanted to see the results of their work; there was no way to see it live. Lastly, all the VFX data was stored in our old file system, which required a lot of manual management of folders and filenames to work correctly, with no easy way to track which files an in-game effect was using.
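The missing “undo” button is worth a moment: an undo feature in an editor like this is commonly built as a stack of reversible edits. Here is a minimal sketch in Python, with invented class and property names (this is an illustration of the general technique, not Riot’s implementation):

```python
# Minimal undo stack for an effect editor. Each edit records the
# previous value so it can be reversed. All names here are invented
# for illustration and are not Riot's actual schema.

class Effect:
    def __init__(self):
        self.properties = {"color": "red", "lifetime": 1.0}

class Editor:
    def __init__(self, effect):
        self.effect = effect
        self.undo_stack = []

    def set_property(self, key, value):
        # Save the old value before overwriting it.
        self.undo_stack.append((key, self.effect.properties.get(key)))
        self.effect.properties[key] = value

    def undo(self):
        # Pop the most recent edit and restore the old value.
        if self.undo_stack:
            key, old_value = self.undo_stack.pop()
            self.effect.properties[key] = old_value

editor = Editor(Effect())
editor.set_property("color", "shadow-purple")
editor.undo()
print(editor.effect.properties["color"])  # back to "red"
```

Real editors usually extend this with a redo stack and with grouping of related edits into a single undoable step, but the core idea is the same.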
SO WHERE’S THE UPGRADE BUTTON?
Building a better VFX tool wasn’t easy, but surprisingly it wasn’t the most complicated part. Deploying that new tool into a production environment with aggressive deadlines, converting the old data into the new system, and making sure artists could transition with minimal disruption—that was another story.
The first thing we needed to do was make sure our VFX artists were on board with this idea, excited about it, and involved in the development process. There’s no way we could have built the right thing without their expertise. VFX artists were already accustomed to and happy with the editing experience of their current VFX tool (despite some of its shortcomings), so the bar was set pretty high to give them something better.
Plus, it had this delicious loading screen:
Now, how do you top this? We started with a lot of ideas on how to improve the existing interface—fancier graphs, new spiffy tables, and clever simplifications—but in the end the major bottlenecks in the original workflow had little to do with the interface itself, so it behooved us to simply carry most of the old tool’s functions over to our new system. Even the best improvements to a software interface come at a temporary efficiency cost (people re-learning the tool, for example), so we opted to provide a familiar interface to ease the transition.
We spent the bulk of our engineering effort on creating a smooth transition for old data and for VFX artists. In addition to building a familiar interface inspired by Syrup, we created tools to help artists convert old data—the textures, meshes, and particle systems that make up VFX—to the new system. We even made sure artists could copy data from the old tool and paste it into the new one.
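Conversions like this usually boil down to mapping fields from the old format onto the new schema, with sensible defaults for anything the old data didn’t specify. A toy sketch (every field name here is invented for illustration; the real formats are certainly richer):

```python
# Toy converter from a hypothetical flat "old tool" particle record
# to a hypothetical nested "new tool" record. Field names are made up.

def convert_particle(old):
    return {
        "emitter": {
            # Fall back to defaults when the old record omits a field.
            "rate": old.get("emit_rate", 10),
            "lifetime": old.get("particle_life", 1.0),
        },
        "render": {
            "texture": old.get("texture_path"),
        },
    }

old_data = {"emit_rate": 50, "particle_life": 2.5, "texture_path": "kayn_q.dds"}
new_data = convert_particle(old_data)
print(new_data["emitter"]["rate"])  # 50
```

The hard part in practice isn’t the mapping itself, it’s handling the thousands of old records that bend the rules, which is why converted data still needs human review.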
So, a new tool that we’re confident artists will be able to use is great and all, but how do we deploy it and use it for real things?
WELCOME TO PARTICLE TOWN, MR. KAYN!
We needed to find some sweet VFX data to bring into the new tool, which we dubbed “Particle Town.” Something meaty and already in progress to test converting data, but not so far along in development that we might threaten a production deadline.
So we talked to the Champion team, and they had just the thing: a champion with three different forms and VFX dialed up to 11.
Enter Kayn, the Shadow Reaper.
Now, the Champ team deals with deadlines of their own, and we didn’t want to guarantee this tool would make them faster right away. In fact, it was likely to make them a little slower, because of the pain of change! While we were confident this tool would be more efficient in the near future, it was important to set realistic expectations, and explicitly commit to doing whatever we could to help make sure Kayn shipped on time.
So what did we do? We spent three months building an MVP (minimum viable product), during which time our entire team was focused on building this tool. We spent a lot of time going back and forth with VFX artists to maintain a good level of confidence that we were building the right thing. Some things got scaled back (like a fancier graph editor) not because the design wasn’t cool, but because we discovered other things were higher priority. We knew we’d have time for polish after releasing something simple. Another good reason to start with something minimal that can be built quickly is that you get feedback about the tool sooner. Direct user feedback from experts is invaluable for future improvements, and it saves us time that might otherwise be wasted building features that wouldn’t get much use.
Finally, at the end of March, we released it…and it did start a little painfully! Data came in broken, and the tool sometimes slowed to a crawl or even crashed. It was crucial that we had good relationships with the artists and provided on-the-minute support during this transition. We got around 100 pages of feedback and more than 150 bug reports and feature requests, and we hadn’t accounted for the fact that filing those requests slowed artists down too! We plugged away as fast as we could; some days we were pushing out fixes to the tool more than 10 times a day.
Fortunately, before too long, things started to stabilize and we could observe the fruits of our labor. We got feedback from artists who had hours of work recovered by the undo system, and others loved that they could see their changes reflected immediately in the new viewer and in-game. Most excitingly, artists discovered some things that made Kayn faster to build than he would have been with our old tool!
One example is Kayn’s ultimate. It can be cast in any direction, which meant it had to be tested for visual quality in every direction. Previously, artists would have to load the game, select a target, cast R on it, move the target, cast again, and repeat until they’d seen it from all angles. Now, with the preview window, they could just move the target around and watch the effect play live in any direction.
Another interesting example is the on-hit effect for Kayn’s Q. We had to ship two versions, one with blood, and one without, due to different requirements for different regions. Normally, the VFX artist would have to load the game to see the effect; now they could just toggle “censored mode” in the preview window.
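A toggle like that generally means both variants are authored on the effect, and the preview (or game) picks one at display time. A sketch with invented names (not Riot’s actual data layout):

```python
# Hypothetical region-variant selection: the effect carries both
# versions, and a "censored" flag chooses which one to display.
# All names here are invented for illustration.

ON_HIT_EFFECT = {
    "default": "kayn_q_hit_blood",
    "censored": "kayn_q_hit_no_blood",
}

def pick_variant(effect, censored):
    # Select the region-appropriate variant of the effect.
    return effect["censored"] if censored else effect["default"]

print(pick_variant(ON_HIT_EFFECT, censored=True))  # kayn_q_hit_no_blood
```

The payoff for artists is that flipping the flag in the preview window replaces an entire load-the-game round trip.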
COMING TO A PARTICLE TOWN NEAR YOU
So what did we learn? Shipping a tool of this complexity in the middle of a busy production schedule is very tricky. But we already knew that. What we did learn was that we’re getting better at it. We’ve had much rockier rollouts in the past. This one went well because we brought everyone affected by it (artists, developers, producers) on the journey with us. We probably spent as much time communicating, setting expectations, and learning from our peers as we did building the tool. This whole process, while not over, will hopefully give us the confidence to keep making bold changes to our art workflows so we can continue to deliver the best content we possibly can to all of you.
We also might be able to use VFX a little more liberally in the future, if we can make them easier to create! What do you think, where should we put some more shiny VFX?