We’ve always aspired to bring memorable and exciting events to Worlds each year, but in 2017 we wanted to see if we could really raise the bar. So when we started laying the groundwork for this year’s Finals opening ceremony, we assembled a team to do something we’ve all, in our nerdiest moments, wished we could do: Summon a dragon.
Bringing League to Life
One of our highest priorities when planning global events is paying tribute to the host nation. With Worlds in China this year, the inclusion of Chinese culture was important. Starting in February of 2017, we slowly pieced together this year’s opening ceremony featuring an authentic Chinese Erhu player, Peking opera masks modeled as LoL champions, an appearance by superstar Jay Chou, and a live performance of “Legends Never Die” by Against the Current. But we still felt like it was missing something special—a twist to reflect the size and scale of Beijing’s iconic Bird’s Nest stadium.
We brainstormed a couple of ideas and eventually landed on using augmented reality, the art of inserting graphics into the real-world environment, as our weapon of choice. Augmented reality, or AR, works by feeding a real-life camera's tracked movement into a virtual camera inside a render engine, then merging the two images to give the appearance of the unreal made real. AR isn’t new to sports broadcasting—forms of AR have been used in traditional sports and in other esports events such as Dota’s International (and even our own NA LCS, MSI, and Worlds)—but we wanted to try creating something at a scale beyond anything we’d attempted before.
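The loop described above can be sketched in a few lines. This is a simplified illustration of the idea, not a real tracking or render-engine API; every name here is made up for the example.

```python
from dataclasses import dataclass

# A minimal sketch of the AR loop: tracking data from the physical camera
# drives a virtual camera, and the rendered CG layer is blended over the
# live video feed. All names are illustrative, not a real engine API.

@dataclass
class CameraPose:
    pan: float   # degrees
    tilt: float  # degrees
    zoom: float  # focal length, mm

def sync_virtual_camera(tracked: CameraPose) -> CameraPose:
    """The virtual camera mirrors the tracked physical camera exactly,
    so the CG render lines up with the real image."""
    return CameraPose(tracked.pan, tracked.tilt, tracked.zoom)

def composite(video_px, cg_px, alpha):
    """Per-pixel 'over' blend: alpha=1 shows pure CG, alpha=0 pure video."""
    return [a * cg + (1 - a) * bg for cg, bg, a in zip(cg_px, video_px, alpha)]

# One frame of the loop: mirror the pose, then blend a row of CG pixels
# over the matching video row using the CG layer's alpha mask.
virtual = sync_virtual_camera(CameraPose(pan=12.5, tilt=-3.0, zoom=40.0))
frame = composite([0.2, 0.2, 0.2], [0.9, 0.9, 0.9], [1.0, 0.5, 0.0])
```

Where the alpha mask is 1 the viewer sees only the dragon; where it's 0 they see the untouched stadium feed, which is why the 3D masking discussed later matters so much.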
Initial ideas for AR use during this year’s opening ceremony included Ryze shooting magic around the stadium and Ashe firing arrows out of the projection screen. But we also had this long-running joke for years: “Let’s just have a dragon perch on the stage.” We didn’t have a portal to Runeterra to import a real dragon (yet), and when that joke was born, the technology needed to make a dragon feel like it was actually flying into a real stadium at a high enough quality hadn’t quite been developed (or made affordable). But by 2017, AR tech had advanced to a level that we thought might just be up to the task. We formed a team of some of our craziest technology innovators to answer the question: “Can we make a dragon?”
While television broadcast AR has evolved greatly over the past few years, we were still worried that it couldn’t handle the scope and fidelity needed for the dragon to feel real. The main challenges we knew we were going to face were:
- How do we get a dragon to fly into the stadium and land on the edge of Bird’s Nest?
- How will the dragon cast realistic shadows not just onto the floor, but onto the actual walls of the stadium itself?
- Will we be able to adjust the lighting on the day of the show to match the weather outside? How?
- How do we time its entrance perfectly with the live performance?
Dreaming of Dragons
We wanted to have a forceful and awe-inspiring presence come to life over the skies of Beijing, and there is no monster in League more powerful, intimidating, and badass than the Elder Drake.
We teamed up with some partners of ours to spin up some brand-new concept art to serve as the base for the custom model and animations needed for this new incarnation of the Elder Drake. We considered a lot of different directions, but as we looked at the concept art, we felt most excited about the concepts that enhanced what was already epic about the Rift’s most intimidating beast. One of the most important things was making sure that no matter what we did, players would still immediately recognize that thing swooping down into the venue as their friendly neighborhood Elder Drake.
In order to hit the mark on what the Elder Dragon would look like if it were brought to life, we examined every aspect of the dragon in detail, from the head, to the limbs, to the wings.
Through a lot of iterations on each individual aspect, it all came together to form the final concept art.
With the concept complete, it was time to begin the modeling process. Unlike traditional character modeling for pre-rendered animations, we had to keep a close eye on how detailed we could make the dragon while also still being able to render it in real-time.
What is Real-Time Rendering?
Real-time rendering means producing each frame of an animation so quickly that it appears to be generated live. This differs from pre-rendered animations, where it can take hours to render a 10-second scene (Toy Story 3, for example, famously averaged seven hours of render time per frame). Even if you’ve never heard of real-time rendering, you’re actually super familiar with it—it’s how your graphics card renders games.
In our situation, we had to render the dragon flying into the stadium in real-time so we could freely move our cameras around and keep our entire opening ceremony in the shot. This meant we had to keep track of the poly count (the number of polygons, and their vertices, used to form the dragon) so that it wouldn’t glitch or lag on-air. This isn’t much different from keeping track of the poly count for in-game models when designing a game; it’s always a careful balance between quality and performance.
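The performance side of that balance comes down to a fixed time slice per frame. At our broadcast rate of 59.94fps, every frame has to finish rendering in under about 16.7 milliseconds, which is what ultimately caps the poly count. A back-of-the-envelope check (illustrative numbers, not our actual profiling tools):

```python
# Frame-budget math behind the quality/performance balance: at broadcast
# frame rate, the renderer must finish every frame inside a fixed slice.

BROADCAST_FPS = 59.94  # NTSC-derived broadcast frame rate

def frame_budget_ms(fps: float) -> float:
    """Time available to render one frame, in milliseconds."""
    return 1000.0 / fps

def fits_budget(render_time_ms: float, fps: float = BROADCAST_FPS) -> bool:
    """True if a measured per-frame render time won't drop frames on-air."""
    return render_time_ms <= frame_budget_ms(fps)

budget = frame_budget_ms(BROADCAST_FPS)  # roughly 16.68 ms per frame
```

If a test render overshoots that budget, the fix is the same one game teams use: simplify the model or the effects until the frame fits.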
Here’s a quick progression of the modeling process:
While all of this was going on, we had a separate team working on the actual animation of the dragon flying into the stadium. We looked at some of our favorite dragons from Game of Thrones, DragonHeart, and others to really get an idea of how a dragon would interact within an environment like the Bird’s Nest. How would it fly around? How would it land? How would it react with people at its feet?
The very first animatic concept had the Elder Dragon bringing in the Summoner’s Cup with it and placing the cup on the stage. Ultimately, the idea of bringing in the cup was abandoned once we opted for the 16-meter inflatable trophy rising up as the dragon circled the venue. Also, the real Summoner’s Cup is too heavy for even a dragon to lift.
We tried having the Elder Dragon land on different parts of the stage and maybe having some sort of interaction with the performers, such as the dancers being scared and running away or ducking out of fear. In the end, we felt that if an actual dragon flew into its den and saw thousands of people there, it would want to make sure everyone knew it was in charge. It’s the Elder Dragon after all, so putting it front and center seemed like the only way to go.
As the animation started to take form, we ran into another issue: The dragon was a bit too big. Basically, its wings glitched through the stadium during its descent, completely ruining the illusion. We also couldn’t see its shadow, which we needed to properly overlay onto the audience to help make the image feel more real (this is what makes it look like the dragon’s shadow is being cast on the actual people in the stadium). We thought about scaling the dragon down, but then we tried doing what a real big-ass dragon would do: Change the flight path. We adjusted the descent to make sure the shadows looked right and the scalyboi wasn’t merging with the walls of the stadium.
Next, we listed out the things we’d need for the animation to play smoothly and keep the fidelity we tried to create during the design phase:
- Guarantee animation runs in real-time at the same frame rate as our broadcast, 59.94fps
- Ensure dragon can cast soft shadows on geometry the size and scale of the Bird’s Nest
- Create 3D masking so that the dragon can appear behind the stadium as it enters and leaves, and behind the player sleds when it lands
- Be able to adjust scene lighting at any time, in real-time, in order to accommodate whatever weather scenarios we may encounter
- Include the ability to color correct the dragon in real-time to match the exposure and contrast of the actual cameras
- Use two cameras with tracking capabilities sent to two different real-time render engines
- Be able to trigger the animation at the same time on both engines in order to be able to seamlessly cut between cameras
- Have the animation triggered at a specific point to line up perfectly with the music of the opening ceremony
- Plan and rehearse enough so that camera operators get perfect shots in an environment with so much going on
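The dual-engine trigger item on that checklist is worth unpacking. One common way to keep two render engines frame-locked (a sketch under our own assumptions, not the actual show-control system we used) is to avoid telling each engine "play now"—network latency would desync them—and instead broadcast a single shared start timestamp, letting each engine derive the current animation frame from its own synced clock:

```python
# Keeping two render engines frame-locked: both receive the same trigger
# timestamp from a synchronized clock and independently compute which
# animation frame corresponds to "now". Hypothetical sketch, not a real API.

BROADCAST_FPS = 59.94

def current_frame(start_time_s: float, now_s: float,
                  fps: float = BROADCAST_FPS) -> int:
    """Animation frame both engines agree on at wall-clock time `now_s`."""
    elapsed = max(0.0, now_s - start_time_s)
    return int(elapsed * fps)

# Even if one engine processes the trigger message slightly late, it still
# computes the same frame for the same instant, so cutting between the two
# tracked cameras stays seamless.
trigger = 1000.0  # shared start time in seconds, from a synced clock
engine_a = current_frame(trigger, now_s=1002.0)
engine_b = current_frame(trigger, now_s=1002.0)
```

The same timestamp scheme also covers the music cue: if the trigger is expressed as an offset into the opening ceremony's timeline, the dragon's entrance lands on the same beat every run.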
In order to test that we were able to do all of these things, we got all of our equipment together and summoned the Elder Drake to our parking lot. Here is the test (excuse the dirty camera lens) that got us really excited:
Once we got onsite at the Bird’s Nest, we had only one week to set everything up, make sure network connections were safe and stable for tracking to be seamless, ensure audio for the dragon was sent to the right place, check that the animation trigger was working properly, and, most importantly, see in-camera that the dragon was lining up perfectly in the stadium—all in addition to the usual hectic setup that accompanies any live League event.
We also practiced with the dancers and dragon around six or seven times, during which we discovered that if the camera move and the choreography of the performers weren’t timed perfectly, the dragon would squash the dancers on the stage (not a big deal to him, but sort of a bad look). The dancers had been practicing this timing for a few weeks at this point, so any changes coming from us could potentially throw everything else out of sync.
We decided to hide any squashing with coordinated camera zooms:
All of the shots that you see are done live by a real camera person. We ran through the dragon sequence countless times with our camera operators to make sure they knew where to have their cameras start their shot, the exact movements they would need to make, and when to make their zooms. We wanted the shots to feel imperfect so that it seemed like a dragon caught the camera guys off-guard.
Here’s a view of one of the camera operators getting a shot of the dragon flying over the audience:
Ten minutes before broadcast, we noticed that the position of the masking for the dragon was off (it looked like it was landing inside of the wall, rather than on the edge). This could have been caused by a variety of things—even something as small as the base of the camera being shifted by centimeters simply because of vibrations. Minutes before going live, we had to go inside of our real-time graphics editing application and move the dragon up. We also took this time to adjust its color to fit better with the time of day, so it wasn’t too bright or too dark. Then it was time to send our Elder Dragon soaring into Finals.
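That last-minute color match can be modeled as a simple per-channel transform: scale by exposure, then stretch contrast around middle grey. This is a generic illustration of the technique, not the actual controls in our graphics application:

```python
def color_correct(value: float, exposure: float = 1.0,
                  contrast: float = 1.0) -> float:
    """Scale a 0-1 channel value by exposure, then stretch contrast
    around middle grey (0.5), clamping to the displayable range."""
    exposed = value * exposure
    adjusted = (exposed - 0.5) * contrast + 0.5
    return min(1.0, max(0.0, adjusted))

# Dimming a too-bright dragon to sit better against an evening sky:
darker = [color_correct(v, exposure=0.8) for v in [0.2, 0.5, 0.9]]
```

In practice an operator nudges sliders like these while watching the composited output in-camera, which is why being able to do it in real-time, minutes before air, mattered so much.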
The audible gasp from the audience in the stadium when the dragon jumps off of the roof sent chills down our spines. Reading and hearing the reactions from our players regarding the dragon is exactly why we do this. The hope and goal is to always create memorable experiences for our fans and players that they can take home and share with their friends. As long as we can keep working on that mission, we will continue to develop and work on the next great thing. Until our experimentations on our pet eastern indigo snakes produce a successful baby Baron, we’ll continue to challenge ourselves to raise the bar on Worlds and deliver experiences worthy of the passion you pour into this community every day.