    10,000 BC
    There are more than 10,000 steps to making history.
    CGSociety talks to DNeg, Nvizage and MPC.
    In the beginning
Creating life is just not as easy as it used to be, especially when you can practically feel its breath on your face from a high-def image 80 feet wide. But this was the task for the VFX teams on 10,000 BC, with its stampeding mammoths, a saber-tooth tiger, vultures, a flock of terrorbirds, and thousands of digital slaves, totaling 509 shots involving CG, with additional 2D comps, paint-outs, and wire removals. The 300 most complex shots were split between Moving Picture Company (MPC) and Double Negative (DNeg). Previs work came from Nvizage, with the remaining 209 shots distributed between Machine Effects and The Senate. But these were no ordinary shots, explained VFX Supervisor Karen Goulekas; these were director Roland Emmerich shots. “Let’s face it. 150 shots per company is not a big number, but Roland’s shots are long and complex, so it’s deceiving. You stare at them a long time and there is no getting away with things. Luckily we had some great visual effects companies, and considering the complexity of the work, it went pretty well.”

    Previs
    Goulekas decided to skip storyboarding and jump directly to previs. To assist with this, she recruited Nvizage’s Nic Hatch and Martin Chamney and requested a team of 14 character animators, four asset builders, modelers, texture painters, character riggers, and a visual effects editor.

Tatopoulos Design Studios scanned its creature maquettes, providing 3D meshes to the team at Nvizage. Using Maya, Photoshop and Deep Paint 3D for textures, and Final Cut Pro for editing, Nvizage went to work, taking care not to lose important details where they counted and honing the designs and textures, especially on the terrorbird and tiger.

They spent the first month focused on producing a robust set of assets that could be used intensively for months. The first prototype took several weeks, revising the level of detail (LOD) down to every last vertex. Once rigging began, Nvizage tried various combinations to allow for interchangeable animation clips, and a chunk of time went to rigging and animating the mammoths to get what felt like believable walk, gallop and run cycles. They also devised a game-style strategy whereby characters and creatures could be animated and played back in near real time. At first Nvizage considered software like Massive or AI.implant, but set that idea aside when they learned the expected number of mammoths was only 18. However, that number grew and grew, and as scenes began to slow, Nvizage began producing a medium LOD with reduced texture resolution for wider shots. That solved the problem, until Emmerich wanted more mammoths.
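
This kind of LOD switch is simple enough to sketch. The Python fragment below is purely illustrative (the thresholds, names and camera parameters are assumptions, not Nvizage's actual numbers): a creature's approximate on-screen height decides which asset tier to load.

# Illustrative LOD switch (not Nvizage's tool): pick an asset tier from the
# creature's approximate on-screen height, via a pinhole-camera projection.
def pick_lod(distance_m, creature_height_m, focal_m, sensor_height_m, image_height_px):
    projected_px = (focal_m * creature_height_m / distance_m) * (image_height_px / sensor_height_m)
    if projected_px > 400:
        return "high"    # hero detail: full mesh, full-resolution textures
    if projected_px > 100:
        return "medium"  # reduced mesh and texture resolution for wider shots
    return "low"         # distant herd members

# An 18-foot (~5.5 m) mammoth 150 m from a 35 mm lens on a Super 35-ish sensor:
print(pick_lod(150.0, 5.5, 0.035, 0.024, 1080))  # -> "low"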

Martin Chamney came up with an elegant solution: a simple geometry caching system, built as a MEL script utility dubbed MAL (Multiple Asset Loader), which generated geometry caches that could be sourced and applied to any number of creatures in the scene. MAL allowed the animators to auto-load any number of referenced assets hooked into a time-warp function, which automatically generated a repeating animation curve for each asset with automatic time phasing.
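
MAL itself was a MEL utility and remains Nvizage's own, but the time-warp idea is easy to illustrate. Here is a minimal Python sketch, with hypothetical names, of many instances running off one shared cache with per-instance phase and speed:

import random

# One shared animation cache drives many creatures; each instance just offsets
# and scales time, so a single gallop clip loops with automatic phasing.
def warped_frame(scene_frame, clip_length, phase, speed):
    return (scene_frame * speed + phase) % clip_length

class CachedInstance:
    def __init__(self, cache_name, clip_length):
        self.cache = cache_name
        self.clip_length = clip_length
        self.phase = random.uniform(0, clip_length)  # automatic time phasing
        self.speed = random.uniform(0.9, 1.1)        # slight per-instance variation

    def sample(self, scene_frame):
        # Which cache to read, and at which (looped, warped) frame.
        return self.cache, warped_frame(scene_frame, self.clip_length, self.phase, self.speed)

# A hundred galloping mammoths, all running off one cached clip.
herd = [CachedInstance("mammoth_gallop_cache", clip_length=48) for _ in range(100)]
print(herd[0].sample(120), herd[1].sample(120))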

“This allowed the animator control to shift the animation phase and speed of movement,” Hatch explained. “The cool thing about this system was that with a few clicks of the mouse, you could load and position a hundred mammoths all galloping, or grazing, and all running off one or two simple animation caches.” The toolset was also streamlined into a system that allowed slated and burnt-in information to come straight from Maya, through hardware renders, to the editing suite. There were no software renders to wait for, no slating procedure to slow them down, and no 2D conversion to take place before the shots could be edited, allowing Nvizage to deliver unprecedented numbers of shots in a short period of time.
    Once the rough motion was in place, Goulekas began working with the editor, cutting together shots and working out the camera. “It was fantastic, that’s the way to go,” commented Goulekas. “You can do it so fast. I would just sit there with my editor and artists would send shots in. We would do everything from a bird’s eye view and break it up into beats.” Those shots were then used for bidding.

Nvizage next brought their magic to location, traveling to London with six people doing previs and postvis on set, reworking the previs assets onto the plates. In New Zealand, a dedicated team of ten artists accompanied Goulekas on location, where they worked on the Giza sequence. A second wave of animators worked postvis at Table Mountain Studios in Cape Town, South Africa, tracking plates and making tweaks to the previs animations, piecing together the final layout plates.

These animations were composited onto plates before being sent to the editing suite. In all, Nvizage spent a whopping 69 weeks working both in house and on location on pre- and postvis. To keep them on the project, two of Goulekas’s “previs stars,” along with many on the pre- and postvis teams, rolled over into production: animation supervisor Greg Fisher joined MPC, and Rob Hammings and several artists joined DNeg, a move that helped maintain continuity of the shots throughout the rest of production.

The hunt for the perfect Mammoth
MPC’s VFX Supervisor Nicolas Aithadi, CG Supervisor Guillaume Rocheron, and Animation Supervisor Greg Fisher were tasked with the opening mammoth scene, with babies, youths, females, males, and specific lead mammoths. Tatopoulos created the original mammoth maquette, but when MPC began production they realized Emmerich wanted something different from the original design, so the mammoth had to be entirely redesigned: the density, color, and length of the hair, its placement on the body, the proportions of the legs, the ears, mouth, and even the tongue. They spent six months on that process, working in 3D to allow Emmerich to see the full vision in its entirety.

Here the challenge was the length of the mammoth hair. The 150 mammoths stood 18 feet tall with hair 3½ feet long, so long that it wasn’t something off-the-shelf software could handle, so MPC wrote their own system, called FURTILITY, a project led by Lead R&D Damien Fagnou. The stumbling block was the number of vertices needed for the hair to bend properly: the more points, the more expensive the simulation. To create the net, made from the fur of previously killed mammoths, MPC twisted the hair into a rope, adding grooming tools to FURTILITY that allowed them to braid the hair around the geometry.
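
The braiding tools are proprietary to FURTILITY, but the basic geometry of twisting strands into a rope can be sketched: wind each strand's control vertices (CVs) in a helix around a guide. A toy Python version, with all parameters as illustrative assumptions:

import math

# Wind one strand's CVs in a helix around a straight guide: more CVs bend more
# smoothly, but every extra point makes the simulation more expensive.
def rope_strand(strand_index, num_strands, num_cvs, radius, twists, length):
    pts = []
    base = 2 * math.pi * strand_index / num_strands
    for i in range(num_cvs):
        t = i / (num_cvs - 1)                # 0..1 along the guide
        a = base + 2 * math.pi * twists * t  # helix angle
        pts.append((radius * math.cos(a), radius * math.sin(a), length * t))
    return pts

# Six strands twisted three times along a one-unit guide.
rope = [rope_strand(i, 6, 8, radius=0.02, twists=3, length=1.0) for i in range(6)]
print(rope[0][:2])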
    Next to tackle was the interaction of the mammoth hair and the net hair. Using SyFlex cloth simulation, the Grooming Team applied hair dynamics to the mammoth fur, causing the fur to react to the net, and the net to push the mammoth fur. “We used dynamic simulation and also occlusion masks to drive the movement of the mammoth hair,” explained Aithadi. “The closer the hair was on the mammoth, the darker the mask was. Darker areas signified that the net was close to the hair.”
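
Aithadi's description suggests a simple mechanism: net-to-fur distance becomes a mask, and the mask weights how strongly the fur is pressed down. A minimal sketch of that idea, with hypothetical names and ranges:

# Net-to-fur distance becomes a mask: 0.0 (dark, net touching) .. 1.0 (far).
def proximity_mask(net_distance, max_distance=0.5):
    return min(max(net_distance / max_distance, 0.0), 1.0)

def lerp(a, b, w):
    return tuple(x * (1 - w) + y * w for x, y in zip(a, b))

# A dark mask pins the fur toward a pressed-down pose; a light mask leaves the
# freely simulated pose untouched.
def fur_position(free_pose, pressed_pose, net_distance):
    m = proximity_mask(net_distance)
    return lerp(pressed_pose, free_pose, m)

print(fur_position(free_pose=(0.0, 1.0, 0.0), pressed_pose=(0.0, 0.2, 0.0), net_distance=0.1))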

MPC couldn’t have 150 mammoths with the same hair density as the hero mammoth, so they had to find ways to reduce rendering times on the crowd mammoths without changing the look. They used their in-house system, Alice, to manage the fur on the crowd mammoths. “Hair is so unstable; if you change the thickness of hair you will change the look of the entire animal, so we had to work a long time to get the original mammoth to match the crowd mammoth,” said Aithadi.

MPC also had to contend with mud sticking to the fur and the net, plus dust and debris kicked up by the mammoths’ movements. “The dust was tricky because we didn’t want the dust to look like smoke, which is really easy to get when you are dealing with that size. So we had to do a layer of smoky dust and inside of that add a layer of gritty, speckled dust that was a little more solid. Then inside of that, we added chunks of debris that would lift up from the ground.”
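
Returning to the density problem Aithadi describes: one standard coverage-preserving trick (an assumption about general practice here, not a confirmed description of MPC's tools) is to reduce the strand count while widening each remaining strand, so the overall silhouette holds:

# Fewer strands, wider strands: rough screen coverage scales with
# count * width, so a tenth of the strands get roughly ten times the width.
def crowd_strand_width(hero_width, hero_count, crowd_count):
    return hero_width * (hero_count / crowd_count)

# A hypothetical hero groom of 500,000 strands thinned to 50,000 for the crowd:
print(crowd_strand_width(hero_width=0.001, hero_count=500_000, crowd_count=50_000))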

Emmerich also added a shot of hero character D’Leh, bare-chested and full frame, running in the middle of the mammoths, which had to be created entirely in CG. Since the digital doubles were built to work at 1/3 screen size, the hero digital double had to be rebuilt and re-rigged with muscle, skin jiggle, and facial expressions, for two shots. “We were really happy to see it actually work! We weren’t confident of that in the very beginning, because the shots came out of nowhere towards the end of the project and we didn’t have time to plan,” laughed Aithadi.

“At one point of the project we had 180 people on the crew, with all kinds of disciplines, a big crew for 150 shots.” The hunt sequence, at 100 shots, took 14 months to complete, running parallel with the Giza sequence, and was the bigger of the two MPC jobs.
Rattling the Saber
DNeg was charged with the terrorbird and saber-tooth tiger scenes, along with the wide shots in Giza. For Goulekas, the tiger in the water-filled pit was the most difficult scene in the entire film. To accomplish it, they shot water in a tank using a blue stand-in tiger, pinned down with sticks and logs, for size and position framing, representing the area where the CG tiger would be, “but 9 times out of 10 we had to paint the logs out and replace them, because the CG tiger is struggling and the branches have to move relative to his motion. We were replacing the entire water surface, or at least part of it, and having it interact against the tiger fur. DNeg did a great job on that.”

Comically, not everyone immediately appreciated the effort. One of the tribe actors timidly approached the producer to admit he didn’t think anyone was going to believe it was a real saber-tooth tiger. It turned out the actor was referring to the blue silk-screened stand-in, thinking that was going to be the final effect.

Jesper Kjolsrud, DNeg’s VFX Supervisor, worked on the film from February 2006 until the project wrapped, and said the tiger development went on almost to the very end, with a lot of R&D to solve issues like muscle and skin deformation. DNeg had done water before, but mostly at large scale; the water for 10,000 BC was close up, splashing around, with full-on simulations. To integrate the tiger, DNeg wrote fluid simulation tools that deformed the fur so it floated. To make the wetness read correctly, they worked on clumping the fur: as the tiger struggled in and out of the water, a wetness matte would fall on and off too; when the matte was fully on, the fur was fully clumped, and it would relax over time. The tiger’s fur and specularity also reacted to rain. “We came up with loads of different looks for this wet fur, and filmed some tests with synthetic fur to see what that looked like, and there were a lot of tools to integrate. There were two development processes going in parallel: the fur dynamics with the fur team, and the work done by the water team.”
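
The matte-driven clumping Kjolsrud describes maps onto a simple per-frame update: wetting is immediate, drying relaxes over time. A minimal sketch, where the rate and names are illustrative guesses rather than DNeg's values:

# Per-region clump weight, advanced once per frame: the water matte drives
# clumping up instantly, and the fur relaxes (dries) gradually afterwards.
def update_clump(clump, wet_matte, dry_rate=0.05):
    if wet_matte > clump:
        return wet_matte             # getting wet is immediate
    return clump * (1.0 - dry_rate)  # drying relaxes the clumping over time

clump = 0.0
for frame, matte in enumerate([1.0, 1.0, 0.0, 0.0, 0.0, 0.0]):
    clump = update_clump(clump, matte)
    print(frame, round(clump, 3))
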
Ramping up the crowd at Giza
The film hits its climactic end in the Giza sequences, a workload shared by MPC and DNeg, with MPC handling the medium and close-up shots and DNeg handling the wide shots. Of the 50 Giza shots, roughly 90% have a CG environment, and five or six are entirely CG.

“Giza was a gazillion elements,” said Goulekas. “People, smoke, dust, blocks, slaves….” She hired Joachim Grueninger from German company Magicon to build a practical miniature of all the pyramids, the palace, 20,000 blocks, scaffolds, and the Nile River at 1/24 scale. It was assembled in Namibia, right next to where they were shooting the live action. “We wanted the dunes to be really tall. This way we got the dunes and the sky for free, and we could shoot with the same sun angle as the main unit.

“The data wranglers from the main unit would keep track of time of day and lens, but also measure the distance to points that were similar on the practical and miniature sets.” One person was tasked with scaling everything down to 1/24 scale and making new plate data sheets that went to the miniature unit, so they could figure out the schedule for morning, afternoon, and sunset shots.
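
The conversion itself is straightforward arithmetic, sketched below with hypothetical field names: distances divide by the scale factor, while lens and time of day (hence sun angle) carry over unchanged.

SCALE = 24.0  # 1/24-scale miniature

# Hypothetical plate-data-sheet entry converted for the miniature unit.
def to_miniature(entry):
    return {
        "time_of_day": entry["time_of_day"],  # same sun angle for free
        "lens_mm": entry["lens_mm"],          # same focal length
        "camera_to_subject_m": entry["camera_to_subject_m"] / SCALE,
        "camera_height_m": entry["camera_height_m"] / SCALE,
    }

main_unit = {"time_of_day": "16:40", "lens_mm": 35, "camera_to_subject_m": 120.0, "camera_height_m": 6.0}
print(to_miniature(main_unit))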

Thousands of digital characters were created at DNeg using Massive, while MPC used their in-house software, Alice. Plus there were approximately 50 stampeding mammoths that had a different set of challenges from those in the hunt: these were working mammoths, used to build the Giza pyramids.

They were kept in captivity, not well fed, and forced into heavy labor. Their skin was scarred and burned by the sun, and the hair was patchy and worn from the harnesses. This required a completely different model that had to be designed with care, since with less hair a mammoth could start to resemble an elephant.

The harnesses were simulated in SyFlex and had to interact with the mammoth and its hair. The mammoths were harnessed together in groups of four, so that when one moved, it would pull the mammoth next to it and the mammoth behind. MPC built a rig with boundaries to help the placement of additional surrounding mammoths.

Each harness was made from roughly 150 objects, including chains and leather straps, that needed to interact with each other, with the mammoth wearing it, and with the mammoth next to it, complete with skin jiggle. And as with the net, it all had to interact with the fur. “Oh, it was a nightmare!” Aithadi shuddered with a laugh.
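
The coupling Aithadi describes, one mammoth dragging its neighbors through the harness, behaves like a chain of stiff constraints. A toy one-dimensional sketch, not MPC's simulator:

# Four mammoths on a line, linked by harness segments treated as stiff springs;
# iteratively pulling each link back toward its rest length drags neighbours
# along when the lead mammoth lunges forward.
def relax_harness(positions, rest_length, stiffness=0.5, iterations=20):
    pos = list(positions)
    for _ in range(iterations):
        for i in range(len(pos) - 1):
            stretch = (pos[i + 1] - pos[i]) - rest_length
            correction = 0.5 * stiffness * stretch
            pos[i] += correction      # both ends share the correction
            pos[i + 1] -= correction
    return pos

# Lead mammoth (last position) lunges from 6.0 to 9.0; the team gets pulled after it.
print(relax_harness([0.0, 2.0, 4.0, 9.0], rest_length=2.0))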

50,000 animated slaves still needed to be added, and it was going to be easier to integrate CG crowds into a CG environment, especially for shadow casting, matchmoving, and roto’ing the blocks. So MPC’s CG Supervisor Guillaume Rocheron was tasked with rebuilding the entire miniature as a 3D model. Gary Brozenich, the set supervisor, took half a terabyte of photos of the environment, and MPC rebuilt the environment in low resolution.
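
MPC's Isis is proprietary, but the underlying photogrammetry step, recovering geometry from overlapping photos, can be sketched generically with OpenCV's two-view pipeline. The image paths and camera intrinsics below are hypothetical:

import cv2
import numpy as np

# Two overlapping photos of the miniature (hypothetical file names).
img1 = cv2.imread("miniature_view_a.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("miniature_view_b.jpg", cv2.IMREAD_GRAYSCALE)

# Match features between the views.
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)
matches = cv2.BFMatcher().knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.7 * n.distance]  # Lowe ratio test

pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# Assumed camera intrinsics; a real pipeline would calibrate or solve for these.
K = np.array([[2000.0, 0.0, 960.0], [0.0, 2000.0, 540.0], [0.0, 0.0, 1.0]])

# Recover the relative camera pose, then triangulate a sparse 3D point cloud.
E, _ = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
_, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
cloud_h = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
cloud = (cloud_h[:3] / cloud_h[3]).T  # homogeneous -> 3D points
print(cloud.shape)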

Rocheron explained: “We did an extensive photoshoot of the miniature, and used Isis, a tool we wrote at MPC, for photogrammetry and geometry reconstruction from multiple pictures, plus a LIDAR scan of the miniature. With this, we were able to rebuild the entire environment.” The result looked good enough to pass in the few shots that were entirely CG. To populate Giza with slaves, Goulekas arranged a huge three-week motion capture shoot, “which I’m told was the largest setup in Europe, with 50 cameras. For three weeks I was directing actions that I knew we needed. Audio Motion was the company, Mick Morris was the company owner, and they did a great job. I was thrilled with all the companies; I’d work with all of them again.”
Related Links
Karen Goulekas
Moving Picture Company
Double Negative
Nvizage
MachineFX
SenateFX
Tatopoulos Design Studios
Audio Motion