Entertainment

Video games are getting Hollywood VFX to level up

Hollywood VFX artists preview how evolving video game technology will continue to shape the biggest blockbusters.

by Eric Francisco

Simply playing Call of Duty won't help Disney make the next Marvel movie. But the underlying systems that power cutting-edge video games are giving Hollywood filmmakers incredible new ways to make movies faster and better, further blurring the line between reality and fantasy.

For decades, the popular image of Hollywood visual effects has been intricate movie sets with green screens and costumed actors reacting to nothing. But it was in the 2010s that Hollywood studios began adopting video game creation software (also called game engines), which allows filmmakers to visualize concepts and ideas instantaneously, sometimes right on set. In recent productions like Disney's The Lion King and The Mandalorian, these effects are live and captured on camera.

What used to take weeks to render on computers now takes seconds, and it's all because of video games. A century after filmmakers of the 1920s experimented with color (and found it an arduous, expensive process), several of Hollywood's visual effects artists tell Inverse the next big leap isn't a few years away. It's already here, in most people's living rooms.

"You're already seeing it," Craig Hammack, Visual Effects Supervisor at ILM tells Inverse; his film credits include Captain Marvel (2019), Skyscraper (2018), and Rogue One: A Star Wars Story (2016). "The technology that video games use, real-time rendering, is starting to make its way into productions. It's enabling film production to get faster and more interactive. And it will eventually make it cheaper and easier to do."

"I think it'll contribute more to shooting than in preview," adds Kevin Souls, VFX Supervisor at Luma whose films include Black Panther (2018) and Birds of Prey (2020). "Technology flows between [movies and games] all the time and games contribute greatly to our stuff because they started in the foundation of needing real-time."

Eric Saindon, Visual Effects Supervisor at Weta Digital, says game engines were used to plan the visual effects of 2019's Alita: Battle Angel before finalizing a shot. "Gaming engines are so fast and so real-time, we're able to use them as more of another tool in our tool belt," Saindon tells Inverse. "We did [on Alita], we're visualizing these scenes with our own engine the same way video games do. We can light the scenes before we shoot them. We can go through this whole process before we ever have to film it."

A sample of the VFX progression of 'Alita: Battle Angel.' On the left, the finished image. On the right, an early stage of the process.

20th Century Studios

Real-time rendering is, as Unity Technologies puts it on its website for the Unity engine, "the process of producing an image based on three-dimensional data." In other words, it's the act of generating dynamic images on the spot, most often inside a video game environment. "3D images are calculated at a very high speed so that it looks like the scenes, which consist of multitudes of images, occur in real-time when players interact with your game."
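
In practical terms, the difference comes down to a deadline: a game engine has to finish each image within a fixed frame budget, while an offline film renderer can take as long as it needs on a single frame. Here is a minimal, hypothetical Python sketch of that loop; the Scene class and its methods are illustrative stand-ins, not any engine's real API.

```python
import time
from dataclasses import dataclass

FRAME_BUDGET = 1.0 / 60.0  # roughly 16.7 ms per frame at 60 fps


@dataclass
class Scene:
    frames_left: int = 300  # stand-in for "the player hasn't quit yet"

    def update(self):
        # React to input: move the camera, reposition a tree, add a rhino.
        self.frames_left -= 1

    def render(self):
        # Stand-in for turning the scene's 3D data into this frame's image.
        pass


def real_time_loop(scene: Scene):
    while scene.frames_left > 0:
        start = time.perf_counter()
        scene.update()
        scene.render()
        # The defining constraint: the finished image must exist before the
        # frame budget runs out. Offline film rendering has no such deadline
        # and can spend hours on a single frame.
        time.sleep(max(0.0, FRAME_BUDGET - (time.perf_counter() - start)))


real_time_loop(Scene())
```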

For movies, this is a massive time-saver, as filmmakers can see environments and even objects on the spot rather than wait weeks for VFX staff to create them after shooting has wrapped. These renders even act as a visual aid for actors, who no longer have to pretend there's a great big dragon when, in reality, they're looking at a green wall. In a May 2019 story from Variety, actor Donald Glover's face lit up (literally) when, on the set of 2018's Solo: A Star Wars Story, a high-resolution video animation of hyperspace appeared before his eyes on a 4K projector set up by Lucasfilm's Industrial Light & Magic.

"The best hires we can make are guys who come from gaming backgrounds."

But the technology making this possible came from gaming, where the ever-increasing demand for photorealism pushed games toward high-fidelity visuals faster than Hollywood could keep up. A leader in the space is Unreal Engine, an industry favorite that powers hits like Fortnite, Gears of War, Mass Effect, and the Batman: Arkham series. Created in 1998 by Tim Sweeney (the founder of Epic Games), the engine was named after the studio's hit shooter Unreal, which later spawned the popular multiplayer spin-off Unreal Tournament.

The earliest iteration of Unreal Engine was made by Sweeney as a way to help a friend, James Schmalz, develop his own game. Before he approached Sweeney, Schmalz built his game's forest environment by hand. (The geometry technique involved is called BSP, or binary space partitioning.)

"When I saw that, I was like, 'No, no, no, James. This is not how we do things,'" recalled Sweeney in a 2017 Gamasutra interview. So he pitched in with a time-saving method: A "real-time" tree maker. "The idea is that you can reposition brushes in 3D space, and then all the BSP work is updated completely in real-time," he said. "This had some mind-boggling implications."

Sweeney was just trying to help a friend save time making his game. Today, Unreal is used by Hollywood to create some of the biggest movies. One of its biggest evangelists is Jon Favreau, the director of blockbusters such as 2008's Iron Man and 2016's The Jungle Book, and showrunner of the Disney+ hit The Mandalorian.

Pre-visualization still of 'The Lion King' (2019).

Walt Disney Studios

Jon Favreau (far left), cinematographer Caleb Deschanel (green), production designer James Chinlund (blue), visual effects supervisor Robert Legato (red), and animation supervisor Andy Jones (white) interact on the virtual set of 'The Lion King.'

Walt Disney Studios

During filming on The Jungle Book, Favreau grew frustrated with how long it took to visualize his movie. "I started to get very frustrated with how long it took in production," Favreau said at SIGGRAPH, an annual computer graphics conference, last July. "Also there was so much CG environment that no matter how much you planned it, you were still pushing a lot of the decision-making to post-production."

The Jungle Book led Favreau to virtual reality, which in turn led to the ambitious production of The Lion King. Using a combination of VR and real-time rendering, Favreau made his remake of the 1994 classic inside a lifelike digital environment that the crew could edit on the fly using handheld controllers. The film's unconventional shoot even had the crew 3D printing replicas of their equipment in order to "feel" like they were actually making a movie on an African plain.

As IGN described in a "set visit":

With a controller in each hand, I was able to speed along the ground or fly through the sky. I could move trees, bushes and rhinos at a whim. It was a virtual toybox where every aspect of the world was customizable. With the push of a button, a toolbar filled with numerous pieces of filmmaking equipment popped up, allowing me to summon film lights, draw designs mid-air with a marker, or set up a camera and start shooting. Through this process, the whole movie is laid out, filmed with video-game graphics, and then sent on to be finalized with lifelike visual effects. I just hope they remembered to take down that rhino I left in the sky.

At SIGGRAPH, Favreau revealed how such a tool is not only more efficient but also a relief. "When you save those iterations, you can save so much work with the people who are actually the technicians and the artists," he said. "I think it's debilitating to have so much work being done that doesn't hit the screen. So it maximizes the efficiencies, and also I think creates a sharper, more focused collective of artists that are working on the vision."

In making The Mandalorian, Favreau continued implementing the technology and credited Epic's Unreal for making shots happen "in-camera," just as Kevin Souls predicted. "We got a tremendous percentage of shots that actually worked in-camera, just with the real-time renders in-engine, that I didn't think Epic was going to be capable of."

Demonstration of the Unreal Engine as a filmmaking tool. Not only can edits to the environment happen on the fly, but they are also fully rendered in real-time.

Epic Games

Richie Baneham, Visual Effects Supervisor at Lightstorm, who won an Oscar for his work on James Cameron's 2009 epic Avatar, says that despite games and movies having "a different goal" at the start of production, the disciplinary overlap means film artists will find work in games and vice versa.

"I find the best hires we can make are guys who come from gaming backgrounds who are hungry to learn narratives," he says. "And those who have narratives to tell, end up on gaming and find technology that allows them to express their intentions. We're all in the same industry."
