CGSociety :: Production Focus
7 November 2013, by Paul Hellard
Matthew Butler has been the VFX Supervisor on some of the best movies produced out of Digital Domain over many years. Butler is also known for bringing his substantial aeronautics and physics qualifications, from Manchester and MIT, to the job, with some brilliant results. Some might remember the Chicago parachute sequences in Transformers: Dark of the Moon.
Digital Domain was the principal studio involved in the VFX of Ender’s Game, although Vectorsoul, Post 23, Method Studios, The Embassy and Comen VFX also worked on many sequences, and G Creative and Goldtooth Creative generated motion graphics for the movie.
Butler came onboard the Ender's Game production with a view to recreating some of the scenes in this well-loved book by Orson Scott Card. The story of Ender's Game was adapted for the screen, and then directed, by Gavin Hood. This ownership of the screen version is said to have endeared the director to the crew as soon as production began.
"I think it’s always important to pay attention to depicting something that is physically correct," says Butler from the DD studio in Santa Monica. "Whatever movie it is, we can immediately see when things move incorrectly." It works much like a defence mechanism: the closer you get to the 'uncanny valley', the more 'just wrong' a digital double character will appear. The same goes for movement in zero-G.
CENTER OF MASS
Getting the action of a body in zero-G, weightless in space, is one of the hardest things to replicate on screen. If a character is floating in space but bobbing up and down, it flags immediately in the viewer's mind that something is just wrong. Yet if you animate a person freely waving his arms and legs around, he will bob up and down, because his centre of mass shifts as he makes that move. On Earth, where there is something to push against, that is exactly what happens; in orbit it looks quite wrong, because the centre of mass cannot move. "One of the advantages of my studying aeronautics was that my roommate at university was Greg Chamitoff, who flew on STS-124 to the ISS in 2008. He spent six months in orbit," explains Matthew Butler. "He came across to Digital Domain and trained the animators, actors and stunt performers on what to expect in that zero-gravity environment. First-hand advice from someone who'd been there."
When Greg Chamitoff was up on the ISS, the crew was installing the Japanese science module, which was pretty big. Before the huge computer racks were put in place, they ran some experiments in which they would position someone in the middle of the big room. That person could not go anywhere, because with nothing to push against they could not move their centre of mass at all. The Digital Domain crew then used that know-how to write tools that emulate those kinds of behaviours. Many moments like this could never have been shown in the Ender's Game scenes, but with the plugins, the behaviour of the centre of mass could be aligned with the movement of the person in shot, and the correct physical path could be calculated and inserted into the action on the run.
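The correction described above can be sketched in a few lines. This is an illustrative toy, not Digital Domain's actual tools: it assumes a body represented as weighted segments, and counter-translates the whole body each frame so the total centre of mass stays pinned, as it must in zero-G.

```python
# Toy sketch of zero-G centre-of-mass stabilisation (names and the
# segment representation are hypothetical, not Digital Domain's rig).

def center_of_mass(parts):
    """parts: list of (mass, (x, y, z)) tuples, one per body segment."""
    total = sum(m for m, _ in parts)
    return tuple(sum(m * p[i] for m, p in parts) / total for i in range(3))

def stabilize(frame_parts, target_com):
    """Shift every segment so this frame's COM matches the fixed target,
    emulating a body with nothing to push against."""
    com = center_of_mass(frame_parts)
    offset = tuple(t - c for t, c in zip(target_com, com))
    return [(m, tuple(p[i] + offset[i] for i in range(3)))
            for m, p in frame_parts]

# A torso plus an arm the animator has freely swung upward:
frame = [(40.0, (0.0, 0.0, 0.0)), (4.0, (0.5, 1.0, 0.0))]
fixed = stabilize(frame, target_com=(0.0, 0.0, 0.0))
print(center_of_mass(fixed))  # back at the origin, within floating point
```

The limbs still move exactly as animated; only the root is counter-translated, which is what makes the naive "bobbing" disappear.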
“With the cunning use of some multiplication,” Matthew quips, "we can stabilise that in a 3D world before rendering. When it came to projecting the actors and stunt people doing their thing on the stage, this was matched up with the CG version we'd prepared, and then rendered out with all the suits and such."
Another tool was created to align the face with a best-fit for how the actor was holding his head. This allows intelligent decisions on perspective shifts, and also minimises what Butler calls the ‘bubblehead effect’. “We don’t move our head and limbs around independently of other things,” he explains.
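The spirit of that best-fit alignment can be sketched as a smoothing pass: take the tracked per-frame head angles, fit a smoothed curve, and measure the residual wobble that would need re-aligning. Everything here is an assumption for illustration (the function name, the exponential-smoothing choice, the single yaw angle); the article does not describe the tool's internals.

```python
# Hypothetical sketch: smooth a tracked head angle and report the
# residual 'bubblehead' wobble per frame. Not Digital Domain's tool.

def best_fit(angles, smoothing=0.8):
    """Return (smoothed_angle, residual) per frame, where residual is
    the wobble left over after the best-fit pass."""
    fit, out = angles[0], []
    for a in angles:
        fit = smoothing * fit + (1 - smoothing) * a  # exponential smoothing
        out.append((fit, a - fit))
    return out

tracked = [0.0, 2.0, -1.5, 3.0, 0.5]  # noisy per-frame head yaw, degrees
for fit, residual in best_fit(tracked):
    print(f"fit={fit:+.2f}  residual={residual:+.2f}")
```

By construction each frame's fit plus its residual reproduces the tracked value, so the face can be re-projected onto the smoothed pose without losing the performance.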
“The curse of our industry is that we can make it look correct, but it also has to tell the story and ‘look cool’,” Butler says. VFX supervisors are always having to make the shot entertaining, even when strictly correct physics would look wrong on screen. “There is a shot in the movie that looks fantastic, but it is just not physically possible,” he adds. “Our job is to make it look as possible as you can get away with, while still looking cool.”
The zero-G Battle Room is one of the novel’s most iconic settings, and Digital Domain realised screenwriter/director Gavin Hood’s vision of it not as a dark, enclosed space, but as a visually epic environment where the Earth can be seen below through clear surfaces. Nearly all the digital doubles created for Ender’s Game were in the zero-G Battle Room. In the simulation cave it was all about the environment around the characters, and DD artists didn’t have to change any of the live-action work there. “Mind you, there is an established horseshoe tiered arena, and this is where we made most use of digital doubles,” explains Matthew Butler. “We always try to keep the performance of the actor and hook that into the CG bodies. Say, delivering lines would never be a CG face.” There was always a balance, though, between being animated, so a performance could be corrected, and being live action, so an actor could deliver a line.
Digital Domain’s animation team developed tools that allowed artists to correct for that movement, then re-projected digital doubles (or parts of their bodies) back into the shots. Because Digital Domain had developed CG versions of the actors’ flash suits and re-created the lighting environment digitally, they were able to keep the actors’ faces from the live action shoot and replace nearly all of the body motions with digital doubles.
“One of the more elaborate digital double setups we had was when Hailee, playing Petra, is knocked against another lead. The digital doubles were everywhere, and the need to reproject to get the correct physics was the usual reason for that. Everywhere we reprojected, there was a digital double,” adds Butler. Sometimes, though, there were ten or so different elements influencing the decision between live action and animation. The classic ICT Lightstage was used for a lot of shots, so a digital comp was always an option if the need arose.
“In the simulation cave, we start off with the lights on, and we are inside a giant termite mound,” says Butler. “When the lights go off, there is a holographic projector that can project back to infinity, and we’re unbounded in terms of what we can see.” It is at this point that the characters are fully immersed in photo-real battle scenes. There is so much going on as these scenes run that at one point 27 billion polygons had to be crunched. Each of the battleships in these sequences was an individual entity as well: the Formics and the International Fleet were individually modelled, textured and lit. The dynamics were also quite specific. The organic motion of the swarming Formics, against the grey militarised performance of the International Fleet, is very much modelled on the way flocks of birds fly in formation. “YouTube is such a great reference for this kind of thing,” adds Butler. “A lot of time was spent getting the flying formation of the fleet to work as a flock of birds. There were millions of them, so this had to be in a system which we could direct.”
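The bird-flock behaviour Butler describes is classically built from three local rules per agent: cohesion, alignment and separation. The sketch below is a generic, minimal version of that idea under those assumptions, not Digital Domain's directable production system, and the gain values are illustrative.

```python
# Minimal boids-style flock in 2D: each boid steers by three local rules.
# A generic sketch of the flocking idea, not a production crowd system.
import random

random.seed(0)  # reproducible toy run

class Boid:
    def __init__(self, x, y):
        self.pos = [x, y]
        self.vel = [random.uniform(-1, 1), random.uniform(-1, 1)]

def step(boids, cohesion=0.01, alignment=0.05, separation=0.1, min_dist=1.0):
    for b in boids:
        others = [o for o in boids if o is not b]
        n = len(others)
        # Cohesion: steer toward the average position of the flock.
        cx = sum(o.pos[0] for o in others) / n
        cy = sum(o.pos[1] for o in others) / n
        b.vel[0] += (cx - b.pos[0]) * cohesion
        b.vel[1] += (cy - b.pos[1]) * cohesion
        # Alignment: match the average heading of the others.
        vx = sum(o.vel[0] for o in others) / n
        vy = sum(o.vel[1] for o in others) / n
        b.vel[0] += (vx - b.vel[0]) * alignment
        b.vel[1] += (vy - b.vel[1]) * alignment
        # Separation: push away from any boid that gets too close.
        for o in others:
            dx, dy = b.pos[0] - o.pos[0], b.pos[1] - o.pos[1]
            if dx * dx + dy * dy < min_dist * min_dist:
                b.vel[0] += dx * separation
                b.vel[1] += dy * separation
    for b in boids:
        b.pos[0] += b.vel[0]
        b.pos[1] += b.vel[1]

flock = [Boid(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(20)]
for _ in range(50):
    step(flock)
```

A directable fleet of millions, as described in the article, would layer art-direction controls and spatial acceleration on top of rules like these; the three-rule core is the part that reads as birds in formation.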