Mon 10th Dec 2012, by Paul Hellard | Production
From visionary director Ang Lee and based on the best-selling book by Yann Martel, Life of Pi is an inspirational story of a young man who, upon surviving a shipwreck at sea, is hurtled into an epic journey of adventure and discovery.
Several studios’ VFX crews worked to make the visuals of Life of Pi a reality. New ground was broken in digital water creation to bring the full effects to the screen, along with an immense amount of work on computer-generated animal characters that had to be believed as completely real. Another area of enormous effort was the lead actor himself. Suraj Sharma only accompanied his brother to the auditions to keep him company and thought he’d try out as well. Months later, he had the lead role, having never acted before and never swum before. No pressure!
Much of the production is set in the largest self-generating wave tank ever designed and built for a movie production. Located on the site of a former airport in Taiwan, the tank measured 70 meters long, 30 meters wide and four meters deep, with a capacity of 1.7 million gallons, which allowed the filmmakers to generate a range of water textures.
The Life of Pi production was also shot in around 200,000 square feet of studio and office space near Pondicherry’s historic Muslim Quarter in India. The production filmed at 18 locations in and around the town, and a crew of 600 (almost half of them locals) worked on the opening sequences of the film. Approximately 5,500 local residents were hired as background actors for the magnificent exterior scenes. The production also transformed the town’s Botanical Gardens into the fictional Pondicherry Zoo.
The original photography was mostly taken in the Taiwan wave tank, the largest ever built for a movie, and occasionally on a gimbal. It was impossible, and quite dangerous, to re-create physical waves in the tank at the scale Ang wanted. This meant that in the sequences created by MPC, the team had to replace the real water with CG water in every shot. In addition to the stereo element, what made their work even more challenging was Ang’s use of very long takes instead of multiple cuts. “This is absolutely fantastic to immerse the audience in the 3D footage but a lot less forgiving for us since it gives the viewer a lot of time to stare at every single detail,” explains MPC VFX Supervisor Guillaume Rocheron.
“On top of dealing with the extremely complex simulations, our main challenge was to provide Ang with a way to design and choreograph the shots, with as few constraints as possible. It is well known that this type of simulation work can be very frustrating because every change means a new simulation and every simulation gives you some slightly different results on each iteration. This can become a real problem if it starts changing timings or events that you wanted to keep and we really wanted to avoid going through those loops, especially knowing how well timed everything would have to be for the various storytelling events.”
The MPC VFX crew used Flowline for their large-scale fluid simulations, so the FX and R&D teams worked with Scanline and put in place a method they called ‘Refined Sheet’. The concept was to take an existing heightfield or geometry representing the main ocean shapes and emit a thin sheet of voxels over it to simulate the water and its interactions. “This represented two major advantages for us,” says Rocheron. “We would be able to use an already established wave layout to solve the water motion, and most of the computing power would be used to simulate only a few feet of water depth instead of having to simulate from the bottom of the ocean, which allowed us to reach the amount of detail that was required for our shots.”
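The saving behind the Refined Sheet idea can be illustrated with a toy sketch in Python. This is only a conceptual illustration under stated assumptions, not MPC’s Flowline implementation: the heightfield, grid spacing and cell seeding are all stand-ins.

```python
def seed_sheet(surface_height, xs, sheet_depth, dz):
    """Seed simulation cells only in a thin band ending at the ocean
    surface, rather than filling the volume down to the sea floor.
    `surface_height` is the pre-established wave layout (a heightfield)."""
    cells = []
    for x in xs:
        top = surface_height(x)
        z = top - sheet_depth          # start only a few feet below the surface
        while z <= top + 1e-9:
            cells.append((x, z))
            z += dz
    return cells

# Toy comparison: a thin sheet versus simulating the full four-meter depth.
surface = lambda x: 0.5 * (x % 2)      # stand-in heightfield
xs = [i * 0.1 for i in range(100)]
sheet = seed_sheet(surface, xs, sheet_depth=0.5, dz=0.05)
full = seed_sheet(surface, xs, sheet_depth=4.0, dz=0.05)
print(len(sheet), len(full))           # the sheet needs far fewer cells
```

The same budget of cells can then be spent on a much finer grid near the surface, which is where all the visible detail lives.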
The MPC R&D team came up with a layout toolset based on Tessendorf deformers and Gerstner waves to allow their artists to create a non-simulated, geometric representation of the ocean, with controls to tweak every single component individually. They would start with a pre-established ocean template with realistic properties in terms of wave size and timing, but then were able to add, remove, shape or keyframe individual waves.
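A single Gerstner wave component can be written in a few lines. The Python below is a generic textbook sketch, not MPC’s deformer toolset, and the parameter names are illustrative:

```python
import math

def gerstner(x, t, amplitude, wavelength, speed, steepness):
    """Displace one surface point with a single Gerstner wave.
    Unlike a plain sine deformer, Gerstner waves also push points
    horizontally, sharpening the crests and flattening the troughs."""
    k = 2.0 * math.pi / wavelength                 # wavenumber
    phase = k * (x - speed * t)
    dx = steepness * amplitude * math.sin(phase)   # horizontal sway
    dy = amplitude * math.cos(phase)               # vertical lift
    return x - dx, dy

def ocean_point(x, t, waves):
    """Sum several wave components; each dict is one independently
    tweakable wave, mirroring the per-wave control described above."""
    px, py = x, 0.0
    for w in waves:
        wx, wy = gerstner(x, t, **w)
        px += wx - x
        py += wy
    return px, py
```

Because each component is just a set of parameters, an individual wave can be added, removed, reshaped or keyframed over time without touching the others, which is what makes the layout reviewable before any simulation is run.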
“We really ended up treating the ocean like a character, having layout artists and animators keyframing the base layer of waves that would then drive our final simulation,” he explains. “We were able to animate that base layer, review it with Ang and Bill Westenhofer, the overall VFX Supervisor, and get the shots locked, in terms of layout and design, before we started the simulation work, which was a real game changer for us.”
“We would then sim the water surface using our Refined Sheet technique and then simulate what we call elements. Based on the now-simulated water surface, we would emit spray that would become bubbles when colliding back with the surface, or mist if caught by the wind; bubbles would then become foam when rising back up to the surface. The amount of wind in hurricane-like conditions meant we had to simulate a wind field above the ocean surface to fill the atmosphere with mist and rain. We ended up spending a lot of time simulating all of these since they were the key to the visual complexity of a storm. There is one shot where we simulated 1.5 billion water droplets!”
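The spray-to-bubble/mist-to-foam logic Rocheron describes is essentially a per-particle state machine. A minimal Python sketch, with an invented wind threshold and simplified conditions that are assumptions rather than production rules:

```python
WIND_THRESHOLD = 30.0   # m/s; illustrative cutoff, not a production value

def step_element(kind, height, surface, wind):
    """One rule-based transition for a secondary 'element' particle,
    following the spray -> bubble/mist -> foam flow described above."""
    if kind == "spray":
        if wind > WIND_THRESHOLD:
            return "mist"        # caught by the wind
        if height <= surface:
            return "bubble"      # collided back with the water surface
        return "spray"
    if kind == "bubble" and height >= surface:
        return "foam"            # rose back up to the surface
    return kind                  # mist and foam persist

# Example transitions:
print(step_element("spray", 1.0, 0.0, 40.0))   # high wind: spray -> mist
print(step_element("spray", -0.1, 0.0, 5.0))   # fell back: spray -> bubble
print(step_element("bubble", 0.0, 0.0, 5.0))   # surfaced: bubble -> foam
```

In a real pipeline each transition would also spawn the new element type into its own simulation pass; here the rules are collapsed into one function purely to show the flow.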