
The Game Awards: Apex Legends

December 2019

Interview with David Lawson and Lisha Tan, Co-Directors on Respawn Entertainment’s ‘Live from Mirage’s Holo-Day Bash | The Game Awards’, and Tawfeeq Martin, Technical Innovations Manager.

1. How did The Mill first get involved with this project?

The project came about as a continuation of an already well-established collaboration. The Respawn team wanted their beloved Apex Legends character Mirage to crash The Game Awards 2019, along with his larger-than-life personality… and to do so on stage and in real-time! It was essential to the team that Mirage’s interaction with The Game Awards’ host Geoff Keighley be live, in order to showcase and leverage the actor Roger Craig Smith’s comedic talents. The awards show was set for Dec 12th, so there was a huge amount to achieve in a short amount of time, but we were ready for the challenge!

2. Who were the other partners on this project, and what did each of them do?

This project was only possible with incredible teamwork and collaboration from our multiple partners. Respawn Entertainment supplied us with the all-important game assets and audio files. Animatrik Film Design installed the optical performance volume and live-linked the body performance and movable props into our real-time scene. Cubic Motion then streamed the intricate facial performance.

AXS supplied the stage venue, which was equipped with fibre-optic lines to the Microsoft Theater. The Mill was responsible for creative oversight, building out the real-time scene in the Unreal game engine, look development to match the style of the iconic Apex Legends cinematic trailers, audio implementation, and overall gamification.
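
To make that division of labour concrete, below is a minimal, hypothetical sketch of the per-frame flow described above: a body stream from the optical volume and a facial stream are sampled each tick and applied to the real-time character before the scene is rendered. It is plain, standalone C++ with invented type and function names; it is not the actual Unreal Engine, Animatrik, or Cubic Motion interface.

```cpp
// Hypothetical per-frame update combining two live performance streams.
// All names are illustrative; the real show used engine-native streaming
// tools rather than this standalone sketch.
#include <cstdio>
#include <map>
#include <string>
#include <vector>

struct Transform { float pos[3]{}; float rot[4]{0, 0, 0, 1}; };    // one joint
struct BodyFrame { double timecode{}; std::vector<Transform> joints; };
struct FaceFrame { double timecode{}; std::map<std::string, float> blendshapes; };

// Stubs standing in for the live network streams from the partner systems.
BodyFrame ReceiveBodyFrame() { return BodyFrame{0.0, std::vector<Transform>(60)}; }
FaceFrame ReceiveFaceFrame() { return FaceFrame{0.0, {{"jawOpen", 0.4f}, {"smile", 0.8f}}}; }

struct Character {
    void ApplySkeleton(const std::vector<Transform>& joints) {
        std::printf("applied %zu joint transforms\n", joints.size());
    }
    void ApplyBlendshapes(const std::map<std::string, float>& weights) {
        std::printf("applied %zu facial blendshape weights\n", weights.size());
    }
};

// Called once per rendered frame; in production the two streams would be
// buffered and aligned by timecode so body and face stay in sync.
void TickLiveCharacter(Character& mirage) {
    BodyFrame body = ReceiveBodyFrame();
    FaceFrame face = ReceiveFaceFrame();
    mirage.ApplySkeleton(body.joints);           // drive the shared body rig
    mirage.ApplyBlendshapes(face.blendshapes);   // drive the facial rig
}

int main() {
    Character mirage;
    TickLiveCharacter(mirage);  // one tick of the (simplified) live loop
    return 0;
}
```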

3. What made you decide to do it live instead of pre-recording?

The idea was always to do it live and in real-time to ensure the experience was one of a kind. Our co-director, Respawn Entertainment’s Creative Director Drew Stauffer, thought it would be amazing to create this never-before-seen blend between the virtual and real worlds, allowing a virtual Mirage to interact seamlessly ‘in person’ with Geoff Keighley. Like two pals in front of a live audience!

If it had been pre-recorded, the audience would have been able to tell from the pauses and timing. We received a great response from the audience once they realized this was done in real-time and not pre-recorded. From the start we knew that fans of the game would enjoy a treat from Mirage, and delivering it in real-time was a whole lot more rewarding. Plus, it begs the question… what is next?

4. Why did you choose to use Persona as the facial solution?

We have kept a close eye on Cubic Motion’s head-mounted camera development since ‘Senua’s Sacrifice’, ‘Meet Mike’ and ‘Siren’. The quality of their work, and seeing them bring other prolific gaming characters to life, gave us confidence in utilizing Persona as the live show’s facial-performance tool. We’ve collaborated with Cubic Motion plenty of times in the past; they have been an integral part of the process and made converting our rig to be real-time compatible more efficient. Persona is the most sophisticated software on the market for what we were trying to achieve.

5. What is the process of taking a game asset onto the big stage?

Advances in hardware and real-time render engines allowed us to find common body and face rigs compatible across all partner platforms without compromising too much of our cinematic production rig. With that, decimation of our hero character was kept to a minimum. Plenty of preparation and rehearsal was necessary; we rehearsed with the technology, space and actors prior to the live event. Once the facial ID was trained up to translate Roger’s facial performance, the whole team could breathe a collective sigh of relief! It was incredibly challenging to coordinate the various technological elements for a live event that you don’t get the opportunity to redo.
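
As a loose illustration of the “common rig” step, the sketch below shows one way joints of a cinematic rig could be remapped onto a shared real-time skeleton, dropping anything the shared rig does not carry. The joint names and mapping table are invented for the example; they are not the actual Apex Legends rig.

```cpp
// Hypothetical remap of a cinematic rig onto a shared real-time skeleton.
// Joint names and the mapping table are invented for illustration only.
#include <cstdio>
#include <map>
#include <string>
#include <vector>

// Cinematic-rig joint name -> shared real-time rig joint name.
// Joints with no entry (e.g. extra twist/correction joints) are dropped,
// which is where most of the rig-side "decimation" would happen.
const std::map<std::string, std::string> kJointRemap = {
    {"spine_01_cine",   "spine_01"},
    {"spine_02_cine",   "spine_02"},
    {"clavicle_l_cine", "clavicle_l"},
    {"upperarm_l_cine", "upperarm_l"},
};

std::vector<std::string> RemapSkeleton(const std::vector<std::string>& cineJoints) {
    std::vector<std::string> shared;
    for (const auto& joint : cineJoints) {
        auto it = kJointRemap.find(joint);
        if (it != kJointRemap.end()) {
            shared.push_back(it->second);  // keep: exists on the shared rig
        } else {
            std::printf("dropping cinematic-only joint: %s\n", joint.c_str());
        }
    }
    return shared;
}

int main() {
    std::vector<std::string> cine = {"spine_01_cine", "upperarm_l_cine",
                                     "upperarm_twist_l_cine"};
    auto shared = RemapSkeleton(cine);
    std::printf("%zu of %zu joints survive on the shared rig\n",
                shared.size(), cine.size());
    return 0;
}
```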

6. Tell us a bit about the logistics, how were things configured?

The actor Roger was in a mo-cap stage across the street from the Microsoft Theater, but was able to see Geoff and the audience via monitors throughout the show. When selecting the mo-cap stage, it was important for us not to inhibit Roger’s performance, so he could react freely with a wide range of motion in response to any unscripted curveballs that Geoff or the audience might throw at us. Maximizing the volume (roughly 20 x 20 ft) also gave the virtual camera operator added mobility.
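
For a sense of what maximizing the volume buys, here is a small hypothetical check that clamps a tracked virtual-camera position to a 20 x 20 ft capture area expressed in centimetres (roughly 610 x 610 cm). The numbers, units and function names are illustrative only and not taken from the actual setup.

```cpp
// Hypothetical clamp of a tracked virtual-camera position to the capture
// volume. A 20 x 20 ft stage is roughly 610 x 610 cm; the larger the
// volume, the more room the virtual camera operator has to move.
#include <algorithm>
#include <cstdio>

struct Position { float x, y, z; };  // centimetres, volume centred at origin

constexpr float kHalfExtentCm = 610.0f / 2.0f;  // ~20 ft across

Position ClampToVolume(Position p) {
    p.x = std::clamp(p.x, -kHalfExtentCm, kHalfExtentCm);
    p.y = std::clamp(p.y, -kHalfExtentCm, kHalfExtentCm);
    return p;  // height left unclamped in this simplified sketch
}

int main() {
    Position camera{400.0f, -150.0f, 170.0f};   // operator near the volume edge
    Position clamped = ClampToVolume(camera);
    std::printf("camera clamped to (%.0f, %.0f, %.0f) cm\n",
                clamped.x, clamped.y, clamped.z);
    return 0;
}
```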

7. What were the actor’s experiences of the whole process and the final event?

This was Roger’s first live mocap experience. He perfected Mirage’s movements and brought a lot of his own ideas and small nuances, which made the performance much more authentic to the character. The team admired how Roger stayed in the motion-capture suit all the way through, from rehearsals on the morning of the show to the end of our live segment. The head-mounted rig did come off now and again for refreshments, but it was an easy calibration to get back up and running, which kept us in good favor with the assistant director!

8. What was it like on the day of the event?

One funny anecdote: we left the clapper in the scene and didn’t realize it until the live experience was over. The CG clapper left in the environment is our Starbucks-cup-in-Game-of-Thrones moment!

9. Do you have any plans to do more live performances?

The Mill is very excited to continue executing more projects in this space. The challenge is educating our clients and partners to rethink the production pipeline, as this method forces us as creatives to put far more emphasis on pre-production rather than post-production in order to deliver high-quality content to a live audience. This project proves that any idea is possible! We think we will now be able to execute long-format scripted content, especially more scripted, character-based content. There is also the possibility of a multi-camera audience talk show or sitcom: building a few CG environments, having a live audience, and treating the CG characters as real-time actors instead of animated characters… We would also like to craft more comedic moments that feel like a gift to a fan base.

10. What advice would you have for anyone else thinking of doing a live event?

Two words for you: call us!
