Ender's Game

Fri 8th Nov 2013, by Paul Hellard | Production


Matthew Butler has been VFX Supervisor on some of the best movies to come out of Digital Domain over many years. Butler is also known for bringing his considerable aeronautics and physics qualifications, earned at Manchester and MIT, to the job, with some brilliant results. Some might remember the Chicago parachute sequences in Transformers: Dark of the Moon, among others.

Digital Domain was the principal studio involved in the VFX of Ender’s Game, although Vectorsoul, Post 23, Method Studios, The Embassy and Comen VFX also worked on many sequences, and G Creative and Goldtooth Creative generated motion graphics for the movie.

Butler came onboard the Ender's Game production with a view to recreating some of the scenes in this well-loved book by Orson Scott Card. The story of Ender's Game was adapted for the screen and then directed by Gavin Hood, and this ownership of the screen version is said to have endeared the director to the crew from the moment production began.

"I think it's always important to pay attention to depicting something that is physically correct," says Butler from the DD studio in Santa Monica. "Whatever movie it is, we can immediately see when things move incorrectly." It works much like a defence mechanism: the closer a digital double gets to the 'uncanny valley', the more 'just wrong' it appears. The same goes for movement in zero-G.


Getting the action of a body in zero-G, weightless in space, is one of the hardest things to replicate on screen. If a character floating in space bobs up and down, it immediately flags in people's minds that something is just wrong. Yet if you animate a person freely waving his arms and legs around, his body will bob up and down, because his centre of mass must stay fixed while his limbs move. That is what happens in the natural world, but on screen it looks quite wrong. "One of the advantages of me studying aeronautics was that my roommate at university was Greg Chamitoff, who flew on STS-124 to the ISS in 2008. He spent six months in orbit," explains Matthew Butler. "He came across to Digital Domain and trained the animators, actors and stunt performers on what to expect in that zero-gravity environment. First-hand advice from someone who'd been there."

When Greg Chamitoff was up on the ISS, the crew was installing the Japanese science module, which was quite big. Before the huge computer racks were put in place, they ran some experiments, placing someone out in the middle of the big room. That person could not go anywhere, because they couldn't move their centre of mass at all. The Digital Domain crew then used that know-how to write tools emulating those kinds of behaviours. Many moments like this would otherwise have had to be left out of the Ender's Game scenes, but with the plugins, the behaviour of the centre of mass could be aligned with the movement of the person in shot, and the corrected physical path could be calculated and inserted into the action on the run.
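The centre-of-mass constraint described above can be sketched in a few lines. The snippet below is an illustrative Python sketch, not Digital Domain's proprietary tool; the function names and the simple mass-point body model are assumptions. It computes a body's centre of mass from weighted segments, then translates the whole body so the COM stays pinned, the way a real body behaves in orbit:

```python
# Illustrative sketch: a body is modelled as mass points (one per limb/segment).
# In zero-G the COM cannot move, so after the limbs are animated freely,
# the whole body is offset each frame to keep the COM fixed.

def center_of_mass(segments):
    """segments: list of (mass, (x, y, z)) tuples for each body part."""
    total = sum(m for m, _ in segments)
    return tuple(sum(m * p[i] for m, p in segments) / total for i in range(3))

def stabilize_frame(segments, com_target):
    """Translate every segment so the body's COM lands on com_target."""
    com = center_of_mass(segments)
    delta = tuple(com_target[i] - com[i] for i in range(3))
    return [(m, tuple(p[i] + delta[i] for i in range(3))) for m, p in segments]
```

In practice an animator keys the limbs freely; a solver like this then offsets the root each frame so the COM never drifts, producing the counter-intuitive "bobbing body, fixed centre of mass" behaviour the astronauts experienced.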

“With the cunning use of some multiplication,” Matthew quips, "we can stabilise that in a 3D world before rendering. When it came to projecting the actors and stunt people doing their thing on the stage, this was matched up with the CG version we'd prepared, and then rendered out with all the suits and such."

Another tool was created to align the face with a best-fit for how the actor was holding his head. This allows for intelligent decisions on perspective shifts, but also minimises what Butler calls the ‘bubblehead effect’. “We don’t move our head and limbs around independently of other things,” he explains.
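The idea of turning a reprojected face to a best-fit orientation can be illustrated with the simplest possible case: a one-axis "look-at" solve. The fragment below is a hypothetical, heavily simplified stand-in for the studio's tool, which would solve a full 3-DOF rotation against tracked head data; it finds only the yaw that points a head's forward axis at the camera:

```python
import math

def look_at_yaw(head_pos, camera_pos):
    """Yaw (radians, about the vertical axis) that turns a head whose
    neutral forward axis is +Z toward the camera position."""
    dx = camera_pos[0] - head_pos[0]
    dz = camera_pos[2] - head_pos[2]
    return math.atan2(dx, dz)
```

Constraining the correction to a best-fit rotation, rather than letting the head counter-rotate freely against the body, is what suppresses the 'bubblehead effect': the head stays coupled to the rest of the motion.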


“The curse of our industry is that we can make it look correct, but it also has to tell the story and ‘look cool’,” Butler says. VFX teams always have to make the shot entertaining, even when accurate physics would look wrong. “There is a shot in the movie that looks fantastic, but it is just not physically possible,” he adds. “Our job is to make it look as possible as you can get away with, while still looking cool.”


The zero-G Battle Room is one of the novel’s most iconic places, and Digital Domain realised screenwriter/director Gavin Hood’s vision of it not as a dark, enclosed space, but as a visually epic environment where the Earth can be seen below through clear surfaces. Nearly all the digital doubles created for Ender’s Game appear in the zero-G Battle Room. In the simulation cave it was all about the environment around the characters, and DD artists didn’t have to change any of the live-action work there. “Mind you, there is an established horseshoe tiered arena, and this is the area where we made most use of digital doubles,” explains Matthew Butler. “We always try to keep the performance of the actor and hook that into the CG bodies. An actor delivering lines, say, would never be a CG face.” There was always a balance, though, between a shot having to be animated, so a performance could be corrected, and having to be live action, so an actor could deliver a line.

Digital Domain’s animation team developed tools that allowed artists to correct for that movement, then re-projected digital doubles (or parts of their bodies) back into the shots. Because Digital Domain had developed CG versions of the actors’ flash suits and re-created the lighting environment digitally, they were able to keep the actors’ faces from the live action shoot and replace nearly all of the body motions with digital doubles.

“One of the more elaborate digital double setups we had was when Hailee Steinfeld, playing Petra, is knocked against another lead. The digital doubles were everywhere, and the need to reproject to get the correct physics was the usual reason for that. Everywhere we reprojected, there was a digital double,” adds Butler. Sometimes, however, there were ten or so different elements influencing the decision between live action and animation. The classic ICT Lightstage was used on a lot of shots, so a digital comp was always an option if the need arose.


“In the simulation cave, we start off with the lights on, and we are inside a giant termite mound,” says Butler. “When the lights go off, there is a holographic projector that can project back to infinity, and we’re unbounded in terms of what we can see.” It is at this point that the characters are fully immersed in photo-real battle scenes. There is an enormous amount going on as these scenes run: at one point there were 27 billion polygons to be crunched. Each of the battle ships in these sequences was an individual entity as well. The Formics and the International Fleet were individually modelled, textured and lit. The dynamics were also quite specific: the organic motion of the swarming Formics, against the grey militarised performance of the International Fleet, is very much modelled on the way flocks of birds fly in formation. “YouTube is such a great reference for this kind of thing,” adds Butler. “A lot of time was spent getting the flying formation of the fleet to work as a flock of birds. There were millions of them, so this had to be in a system we could direct.”
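Bird-flock motion of the kind referenced here is classically generated with a boids-style model: each agent steers by cohesion (toward the flock centre), alignment (matching the average heading) and separation (away from close neighbours). The sketch below is a minimal 2D Python version of that idea, illustrative only; a directable production crowd system would be far more elaborate:

```python
def flock_step(boids, dt=0.1, r_sep=1.0, w_coh=0.01, w_sep=0.05, w_ali=0.05):
    """One update of a minimal 2D boids flock.
    boids: list of dicts with 'p' (position) and 'v' (velocity) as [x, y]."""
    n = len(boids)
    cx = sum(b['p'][0] for b in boids) / n          # flock centre
    cy = sum(b['p'][1] for b in boids) / n
    avx = sum(b['v'][0] for b in boids) / n         # average velocity
    avy = sum(b['v'][1] for b in boids) / n
    for b in boids:
        # cohesion: steer toward the flock centre
        b['v'][0] += w_coh * (cx - b['p'][0])
        b['v'][1] += w_coh * (cy - b['p'][1])
        # alignment: match the average heading
        b['v'][0] += w_ali * (avx - b['v'][0])
        b['v'][1] += w_ali * (avy - b['v'][1])
        # separation: push away from neighbours closer than r_sep
        for o in boids:
            if o is b:
                continue
            dx = b['p'][0] - o['p'][0]
            dy = b['p'][1] - o['p'][1]
            if dx * dx + dy * dy < r_sep * r_sep:
                b['v'][0] += w_sep * dx
                b['v'][1] += w_sep * dy
    for b in boids:
        b['p'][0] += b['v'][0] * dt
        b['p'][1] += b['v'][1] * dt
    return boids
```

Turning the weights up on alignment and down on separation pushes a flock toward regimented, drone-like formations; the opposite gives an organic swarm, which is the kind of dial a director-friendly system needs.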

The Director Gavin Hood also re-imagined the concept of the Command School’s simulator, taking it to an immersive, grand interactive space. In keeping with Hood’s direction that the Simulation Cave should not look like imagery projected onto surfaces, Digital Domain created a holograph-like space in which Ender and his team are fully immersed in a photo-real version of the ‘games.’ In a departure from typical approaches to shooting this kind of environment, the filmmakers lit the content of the displays and the environment independently – then deliberately took artistic license to break that pattern for the final explosion of the Formic planet.


The content displayed in the sim cave is a volumetric holograph that allows Ender to control his point of view of the ‘game.’ The graphical readouts and control mechanisms he uses to drive his POV (created by VFX companies G Creative Productions Inc. and Goldtooth Creative Agency Inc.) are integrated with the photo-real environment. As Ender moves through the grid the graphics are entwined with his reality.

Four key scenes take place in the sim cave: Ender’s introduction to the environment by Mazer Rackham and three ‘simulated’ battles with the Formics -- the ice battle, the battle where Ender loses control and his actions result in chaos and crashing ships, and the final battle where Ender commands his ships brilliantly and orchestrates the destruction of the Formics’ home planet. Throughout all of the battle sequences, VFX Supervisor Butler focused on maintaining physical accuracy in behavior, size and scale of all ships, planets and elements in outer space, and explosions that reflect the proper dynamics of space. Digital Domain also created movements for the Formic and International Fleet ships that are clear and distinctive – Formic ships behave like an organic swarm reflecting a hive-like mind while the IF’s ‘pilotless’ human-driven drones are regimented, systematic and mirror behavior learned in the zero-G battle room; moving in unison but individually intentional.

The final battle is the source of the most compute-intensive, geometry-heavy effect sequence Digital Domain has ever created. At one point when the IF fleet surrounding the ‘Little Doctor’ engages with the Formic fighters, there are close to 100 million ships on the screen simultaneously. Those, together with the ‘Little Doctor’ itself comprise more than 27 billion polygons in a single shot.



The movie carries a very deep message about a society’s practice of taking children away to train them to be killers. In the animated sequence of the movie called The Mind Game, Gavin Hood acted the part of the Giant and was motion-captured along with Asa Butterfield and Abigail Breslin, the two leads. “It’s beautifully disturbing,” adds Butler. “We collaborated with a small group of artists to bring a very different look to this sequence of avatars.”

“This bunch of guys is called Vectorsoul and Post 23, primarily based out of Barcelona in Spain. They are a disparate band of artists who hadn’t done any movies, but have a record of creating the most incredible work. Really ‘out-there’. They helped with a lot of the animating of the Mouse in this sequence,” said Butler. They worked from concept drawings, applying the actors’ captured performances onto their characters, animating them and integrating them into the final sequences. Digital Domain then created the human characters and the butterfly.

“We managed to create an almost staccato, military, geometric formation for the International Fleet, because a lot of them are computer controlled,” says Butler. On YouTube there are some amazing videos of students who have come up with quad-copters that can do all kinds of things in the air and stabilise perfectly. Although there is order and structure, there’s a kind of looseness and organic sense to the movement, yet it is completely computerised. The Digital Domain crew modelled the International Fleet on that. “We tried to make sure there was still a sense of individuality between any two,” Butler adds.

Floating around in zero-G, as mentioned before, is one of the hardest effects to do well. “I’m really pleased with the way the zero-G sequences came out,” says Butler. “They’re really beautiful, quite stunning and very different. To my eye it looks quite correct.” Gavin Hood was careful to show the first sequence with the students just getting used to the effect, with stark lighting and the Earth rolling past down below: the classic way we’ve all seen EVA footage. But as the movie progresses, each zero-G sequence is set in softer and softer surroundings, until the later scenes are shot in that classic ‘golden hour’ or ‘magic light’. The final zero-G scene is in pitch black, with the weapons the only props that paint light into the scene. “The final battle is also so rich, there’s so much going on. It’s a spectacle, and please everyone, go and see it on the biggest screen you can,” Butler says.

Autodesk’s Maya is the predominant package for modeling at Digital Domain and, as always, SideFX’s Houdini is the heavy lifter for the effects and rigid body dynamics. “So when you see the ice shelf being exploded in Ender’s Game, it all comes down to Houdini; even though we are also using proprietary and third-party solvers, the front end is Houdini,” he adds. Computing the dynamically changing centre of mass, stabilising it, and turning the 3D face to camera was the job of Maya. “Getting the live-action faces to match correctly with the animated bodies was something we had to do perfectly, and we simply would not have been able to do that without Maya, and the brilliant creatives behind the screens.”

All images: Motion Picture Artwork ™ & © Summit Entertainment, LLC. All rights reserved.


