How 'The Jungle Book' Fused 'Avatar', Pixar, and 'Iron Man'
Director Jon Favreau on the competitive future of VFX-heavy live-action adaptations.
When Disney began making live-action adaptations of its library of classic animated movies, the project seemed fairly straightforward. One of the first films it chose, Maleficent, was a fairy tale filled with human actors. Angelina Jolie's cheekbones aside, it wasn't a cutting-edge film. It was a traditional story shot in traditional ways. It would have been easy for Disney to stay on that path, but the studio opted instead to take a left turn into wilder terrain.
The Jungle Book presented a fairly obvious dilemma: bears and panthers look amazing on camera but are hard to control, while CGI is controllable but hard to make look lifelike. Director Jon Favreau had to find a creative solution, and in doing so, he and his team developed new technologies to make the "live-action" elements look as real as possible without terrifying the kids.
“The trick was to look at how others had done virtual environments before us and see how we could work with those technologies, and then figure out a way to get the performance of one individual kid seeming like it was in a live-action movie,” Favreau explains. “And to see if we could integrate the practical and CG elements together so it’d feel live-action.”
Favreau got a lot of help from the expertise of VFX supervisor Rob Legato, who did groundbreaking work on Avatar. The movie proved a critical and box-office victory, and Favreau found a new way to practice his craft. The filmmaker discussed the detailed process he created with Inverse, offering up some insight into his next Disney live-action challenge.
You did motion capture before you filmed at all. What was the timeline, step-by-step?
We started off as though we were doing an animated film. We had a story department and artists, and we were doing keyframe illustrations and color studies and character design, all the things you'd do if you were doing a Pixar film. Then we broke the pattern by treating the film as though it were a motion capture film, like Avatar. So we went to sets on a volume that were contoured properly but didn't have any art direction. They were just platforms and people wearing motion capture suits, and we recorded all the performances for blocking and camera blocking, and refined the set design at the same time. Then we edited the film, so it looked like an hour-and-a-half-long video game when we were done.
Then once we decided we liked the cut and the performances and voice performances, we went to the blue screen stage. There we took Neel, who played Mowgli, got him in makeup and costume, and built as much set as we needed around him for him to interact with. And then everything else was blue screen with puppets and performers, so we could get each shot that was in our video game cut of the movie. Then we would cut his coverage in, and we'd see if it all fit together.
When we were done with that, it went into proper post-production, the way it would on a movie like Iron Man. There we built the animation and the characters surrounding him to match his performance. There were really four phases to the whole thing, and each one borrowed from a different type of filmmaking.
How did it work with the animals during that third phase of production? Were they only partially rendered, or fully animated?
We were using a game engine, and using light files so we could move them in real time and adjust the environments in real time. So when we were motion-capturing, you could look on the screen and see the environment, animals, and Mowgli all moving together. But it wasn't the level of finish or animation that you'd see in a finished motion capture film. It looked more like what you'd see in a video game. There was enough there for you to determine what the cut of the film should be, but not enough to release it for people to see. It gets real expensive when you start getting everything to look photo-real, so you save that final render for the last phase.
Did you have to position Neel in exact ways on stage, or could you place him throughout the frame as needed in post?
When we were doing the early stages, like the motion capture, there was a lot of discovery at that point. But when we started using the 3D cameras and were on the real sets, it was treated as though we were doing a real movie. If you walked on the set, it would look like any other live-action film, except there was a lot of blue screen. We were using Simulcam, which was developed for Avatar, where you could look at the monitor and see the blue screen elements mixed with the environments, so you could actually line up your characters with the horizon and other animals. Once you got used to it, it was very intuitive, but for people who came by the set, it looked like something they'd never seen before.
Was there any technology you had to develop to make this thing possible?
A lot of people who worked on Avatar worked on this film as well. The big difference was that we had a lot of interactivity, so it was a matter of not just using motion capture for performance, but also having the live-action elements knit together and being able to touch, and in some cases be right on top of and in back of, the animals. And those connection points are very difficult. The way fur interacts with wet fingers, that's something they didn't have to worry about on Avatar. So a lot of fur tools and water tools have come a long way. We had to generate an environment, too. Avatar was a hyper-real environment; it was another planet. This was supposed to look like the jungles of India, which we had to match very closely so it never took you out of the reality of the piece.
There's an uncanny valley, and it's different for humans and animals, but something still has to look real. But at the same time, you don't want those predatory animals to look so hyper-real that they're terrifying for kids.
You kind of create your own rules system. There are certain things you can't mess with, but certain animals you can get away with [changing] more than others, because it's all a matter of how familiar the audience is with something. People are very familiar with human faces. The panther's face, we could exaggerate the proportions a bit and still have people think they were looking at the real thing. And sometimes that helped the performance, the way that we changed it.
We had a lot of fun with scale, to help match up with what was in the 1967 film, when the animals were much larger. We made the environment much bigger and more creative, but definitely drew inspiration from what happened in nature. We didn't really have problems with the environments or the animals; the trouble usually comes with the anthropomorphic faces of the animals, so we avoided doing anything intentionally that skewed it too much.
You’re going to do The Lion King next, which is a “live-action” film — but I can’t imagine there will be much actual “live-action,” right?
No, there's not. The puzzle to solve on this one is how do you make it look as photo-real and as much like a live-action film as possible, when all the elements you're using are being generated digitally? We learned a lot of things from The Jungle Book, and there are a lot of sequences in it that don't include any live-action. And it fits right in, and it feels as though you're watching a live-action movie. A lot of scenes just have the animals.
So what we're thinking is that if we continue pushing in that direction, and use real animal performances to base the animation on, and real animals to base the designs on, and real environments to match, and real camera moves by using the motion capture stage to capture all the camera work in an analog way, it'll pull us away from the feeling of animation. We'll find ourselves with the rules you'd have to stick to if you were doing a live-action movie.
Do you imagine it’ll be as photo-real as the animals in The Jungle Book? Lions can be kind of terrifying.
I think we have to go further. Lions are terrifying at times, but cats carry a lot of emotion in their face. They’re probably the animal species that offers the most possibilities as far as anthropomorphic performance without actually changing the way they truly express things, and the way their faces are laid out. Their snouts are kind of flat compared to a wolf, and their eyes are very human. Lions have round pupils.
All the things you would cheat to make an animal look more human, nature's already done for us. If you have the proper design, proper lighting, and proper performance, you can get a lot of emotion out of that species. And the story and the music are so strong, we're hoping we've discovered a way to capture the spirit and emotion of the original animated film, while these technologies allow you to feel as if it's playing out in real life.