Super CG - Superman Returns
It’s a bird! It’s a plane! It’s geometry, texture maps, and shading algorithms! Although most people watching Warner Bros.’ ‘Superman Returns’ might guess that the Superman seen flying from a distance is a digital double, few will believe that in many of the close-up - very close-up - shots, the superhero is a digital character.

    One reason Superman’s double looks exactly like actor Brandon Routh is that in a way, it is. Sony Pictures Imageworks used the LightStage 2 system developed by Paul Debevec to capture photographs of Routh’s face that they applied to a 3D model. Easier said than done.


“It’s not just a matter of shooting imagery,” says Rich Hoover, visual effects supervisor for Superman at Imageworks. Hoover led a team that created around 300 shots for the film, including close-ups of the digital Superman used throughout and a dramatic shuttle/airplane rescue. Bryan Singer, who directed 'X-Men' and 'X-Men 2', directed the film, with Mark Stetson supervising the overall effects. All told, 11 effects studios worked on 'Superman Returns' under Stetson's supervision. Framestore CFC handled the digital Krypton environments. Rhythm & Hues built Superman’s Fortress of Solitude with the Jor-El (Marlon Brando) hologram, and handled a sea rescue.
    Rising Sun created a young digital Clark Kent skipping over a cornfield. The Orphanage helped Superman fall from space, bounced bullets off his body, rescued Kitty during a car chase, and added a mansion to the waterfront. Frantic Films developed a crystal-growing algorithm that other studios used and created the opening outer space shots.

Photo VFX, Lola Visual Effects, and Eden FX fixed eyes, costumes, and cosmetics. The Pixel Liberation Front previsualized shots. And New Deal Studios constructed miniatures.
Face Off
Imageworks supplied its Superman to the other studios, who used it when the digital double moved close enough to camera to be recognized. “I had been at Sony working on 'Charlie’s Angels', so I was aware of the work Imageworks had done, especially on 'Spider-Man 2',” Stetson says. “There are some impressive close-ups of Alfred Molina [Doc Ock] that people missed, but I didn’t miss them. I wanted to take advantage of that technology.”

For 'Spider-Man 2', Imageworks had used the LightStage 2 system at the University of Southern California’s Institute for Creative Technologies to capture Molina’s face. They used the same system for 'Superman Returns'; however, they tweaked the acquisition techniques.
With the LightStage 2 system, Routh sat in a chair while a rotating semicircular arm holding 30 Xenon strobe lights swung around him every eight seconds. As the arm moved and the lights strobed, six synchronized Arriflex movie cameras positioned around Routh’s head - two more than were used for Molina - shot footage at 60 frames per second. Because even 60 fps doesn’t quite stop motion, the crew bolted the cameras down and braced the actor’s head and neck. “It’s difficult for [people being captured] to hold still, but the more still they are, the sharper the capture,” says Hoover.

    By blending all the images taken by the six cameras at one moment in time, the crew created one texture map to wrap around a 3D model of the actor’s head. “I hesitate to call it imagery, although it is an image,” says John Monos, CG supervisor. “We have algorithms that extract the reflective data stored in the captured images, so the map represents the reflectance of the skin.”
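As a rough sketch of what such a blend involves - with invented array names and a simple per-texel confidence weighting standing in for Imageworks' actual algorithms - consider:

```python
import numpy as np

def blend_camera_views(camera_images, camera_weights):
    """Blend per-camera texture maps into a single map.

    camera_images : list of (H, W, 3) float arrays, one per camera,
                    already reprojected into the model's UV space.
    camera_weights: list of (H, W) float arrays giving each camera's
                    confidence per texel (e.g. how directly that camera
                    saw the surface).  Hypothetical inputs.
    """
    num = np.zeros_like(camera_images[0])
    den = np.zeros(camera_images[0].shape[:2])
    for img, w in zip(camera_images, camera_weights):
        num += img * w[..., None]   # weight each texel's colour by camera confidence
        den += w
    den = np.maximum(den, 1e-6)     # avoid divide-by-zero where no camera saw the texel
    return num / den[..., None]
```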

The process produced 480 reflectance map images - 70 gigabytes of textures - that wrapped the model according to the lighting in a shot. When lighting technical directors (TDs) positioned lights in a scene, a system developed by Imageworks automatically brought in the map with the matching lighting. Put simply, if a TD shined a light on the left side of Superman’s face, the system found a blended map of images shot by the six cameras when the strobe lights pointed at the left side of Routh’s face.
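A toy version of that lookup, purely illustrative (the function, the k-nearest blending, and the data layout are assumptions, not the studio's system), might pick the captured strobe directions closest to the CG light:

```python
import numpy as np

def pick_reflectance_maps(light_dir, capture_dirs, k=4):
    """Return indices and blend weights of the captured maps whose
    strobe directions best match a shot light's direction.

    light_dir    : (3,) unit vector of the CG light in head space.
    capture_dirs : (N, 3) unit vectors, one per captured reflectance map
                   (N was 480 on 'Superman Returns').
    k            : number of neighbouring maps to blend.
    """
    light_dir = light_dir / np.linalg.norm(light_dir)
    sims = capture_dirs @ light_dir          # cosine similarity to each strobe position
    idx = np.argsort(sims)[-k:]              # k closest strobe directions
    w = np.clip(sims[idx], 0.0, None)
    w = w / w.sum() if w.sum() > 0 else np.full(k, 1.0 / k)
    return idx, w
```

The selected maps would then be blended per texel, much as the camera views were blended above.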

Because the man of steel didn’t have a frozen face, Imageworks developed algorithms to manage the reflectance data as the face moved - as animators changed digital Superman’s expression. Animators couldn’t work interactively with the photoreal faces; instead, they worked with simply shaded models in Maya. But an Image Based Rendering (IBR) tool provided feedback.

“Part of the difficulty in using IBR is having coverage,” says Monos. “If Routh’s eyes were looking one way, but the animators moved them the other way, it might reveal a portion of the eye that didn’t have good reflectance data. So, we used both IBR and traditional rendering.”
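In effect, that mix is a per-pixel composite between the image-based and traditionally shaded renders, driven by a coverage mask; the sketch below is a hedged illustration, not Imageworks' code:

```python
import numpy as np

def composite_ibr_with_traditional(ibr_rgb, trad_rgb, coverage):
    """Use the image-based render where reflectance coverage is good,
    and fall back to a traditionally shaded render elsewhere.

    ibr_rgb  : (H, W, 3) image-based render of the face.
    trad_rgb : (H, W, 3) conventional shader render of the same frame.
    coverage : (H, W) values in [0, 1]; 1 means the captured data fully
               covers this pixel, 0 means no usable reflectance data
               (e.g. the eye looking somewhere it was never captured).
    """
    coverage = np.clip(coverage, 0.0, 1.0)[..., None]
    return coverage * ibr_rgb + (1.0 - coverage) * trad_rgb
```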
Faster Than A Locomotive
In addition to photographing Routh’s face, the crew motion-captured his body and facial expressions to help animators create the double’s performance. Although Imageworks captured basic body positions - running, walking, arms stretched forward, and flying - they concentrated more intensely on facial capture for close-ups of Superman’s head and shoulders. Digital Superman doesn’t talk in the film, but he could have. For facial capture, the team used techniques developed by Imageworks for 'The Polar Express' and 'Monster House'.

For scenes of Superman flying, Routh “flew” on wires rigged on a 100-foot-long greenscreen stage. Often, though, Imageworks’ digital Superman replaced the greenscreen footage in final shots - sometimes completely, sometimes partially. Even so, the greenscreen footage provided reference. Jones also turned to Alex Ross drawings for inspiration. “Ross does comic book poses, but his drawings are from real life,” Jones says. “It was definitely a challenge working out what Superman’s flying pose would be, what happened to his body when he flew. He’s a man of steel, but if he looked too stiff, he’d look CG.”

    Although Superman was sometimes real and sometimes digital when he flew, his cape was usually digital. “They couldn’t get the wind right in the greenscreen room,” says Jones. “If it blew too hard, Routh would squint and get bloodshot eyes.” Moreover, the digital cape could be art directed - even when Superman flew at 1,200 mph. For cloth simulation, the studio used Syflex software. To art direct the cape, animators blocked out poses that became targets for the simulation, and a team of technical directors led by Takashi Kuribayaski developed a workflow to sculpt the physics. For hair simulation - Superman’s and digital Lois Lane’s hair - the crew used in-house tools.
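One generic way to sculpt physics toward blocked poses is to add a goal force that pulls the simulated cloth toward the animators' targets; the following sketch shows that general idea with made-up parameters, and is not meant to represent Syflex's API or the Imageworks workflow:

```python
import numpy as np

def step_cape(positions, velocities, target, dt=1.0 / 24.0,
              wind=np.array([0.0, 0.0, -8.0]), goal_strength=4.0, drag=0.2):
    """One explicit integration step for cape points that are both
    wind-driven and pulled toward an art-directed target pose.

    positions, velocities, target : (N, 3) arrays of cloth points;
    'target' is the animator's blocked pose for this frame.
    All parameter values are illustrative.
    """
    goal_force = goal_strength * (target - positions)  # pull toward the blocked pose
    accel = goal_force + wind - drag * velocities      # crude wind and air-drag model
    velocities = velocities + accel * dt
    positions = positions + velocities * dt
    return positions, velocities
```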
Metropolis
In addition to creating digital doubles, Imageworks gave Superman a digital city to fly through, the Daily Planet office building for Clark Kent and Lois Lane to work in, and, in the dramatic opening sequence, a shuttle, a burning airplane, and a baseball stadium. For aerial shots of Metropolis, the crew mapped photography onto geometry, but they also built a digital city within the city for close-up shots. “We created a city grid around the Daily Planet that’s one hundred percent digital,” says Hoover. “The ground, the cars, everything. Then we situated that into Manhattan as we know it and made all the streets work.” For the digital city, the modelers revamped some buildings from 'Spider-Man 2', but they modeled the Daily Planet from scratch, matching and greatly extending a two-story set and also a rooftop set.
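Mapping photography onto geometry is, at heart, camera projection; here is a minimal pinhole-projection sketch (the inputs and the recovered camera matrix are assumptions, not taken from the production):

```python
import numpy as np

def project_to_photo(points_world, cam_matrix):
    """Project world-space building vertices through a pinhole camera
    to find the photo pixel each vertex should sample.

    points_world : (N, 3) vertex positions.
    cam_matrix   : (3, 4) projection matrix (intrinsics @ extrinsics)
                   recovered for the photograph; assumed given.
    Returns (N, 2) pixel coordinates.
    """
    homog = np.hstack([points_world, np.ones((len(points_world), 1))])
    proj = homog @ cam_matrix.T           # homogeneous image coordinates
    return proj[:, :2] / proj[:, 2:3]     # perspective divide
```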

“The Daily Planet is the largest building we have ever built here,” says Bruno Vilela, CG supervisor. “It’s 908 feet tall with a 30-foot globe on top, and it isn’t symmetric. We couldn’t build one façade and then replicate it.” Textures painted in Photoshop and Body Paint added details to the complex geometry; the building has no displacement maps. As with 'Spider-Man 2', a proprietary rendering interface, BIRPS, provided artists with a way to handle the massive amount of data and move it, with assigned shaders and lights, through RenderMan.

Modelers also customized the surrounding areas to create a city that resembled New York but was not New York, removing such landmarks as the Statue of Liberty and the Chrysler Building and changing the bridges. And then, having built Metropolis and the Daily Planet, Imageworks destroyed it. “I think that the destruction of the Daily Planet pushed the edge of what we’re doing here in terms of simulation and compositing,” says Vilela.

In one shot, the globe on top of the building rolls off its base and crashes against the building as water flows down its face. Imageworks created the entirely digital shot with an assist from Tweak Films, which simulated the water.
    To destroy solid material, the Imageworks team used Maya and rendered the elements with RenderMan. For smoke and dust, they used a hybrid pipeline that moved data from Maya through Houdini into RenderMan or into Imageworks’ own Splat renderer. The latter pipeline also wrangled smoke and fire for the shuttle destruction scene.
    Plane Wrestling
In the shuttle destruction sequence, an airplane - carrying Daily Planet reporter Lois Lane and other members of the press corps, with a space shuttle on its back - heads away from Earth. Rather than rocketing from Cape Canaveral, the astronauts will release the shuttle from the 777 when they reach 50,000 feet.

But the couplings don’t open, the shuttle fires anyway, and Superman flies to the rescue. He disengages the shuttle, throws it into space, and looks back to see that the 777 has caught fire. When he grabs a wing, it breaks off. Then the other wing breaks. He chases after the plane as it plunges toward a baseball stadium and catches it by the nose barely in time. The crowd cheers as he jockeys the plane onto the ground. Superman is back. Only a few baseball players, 30 extras used for lighting reference, and the ground are real in this sequence. Except for a few close-up shots of Superman himself, everything is digital.

    Plates shot of a peninsula near Washington DC provided the land. For the stadium, Imageworks projected photos onto a model created from a Lidar scan of a stadium and remapped the city beyond. They packed the stadium with digital people animated with around 70 motion cycles that they controlled using a custom Houdini-based solution. The fire, smoke, ocean water, airplane, shuttle, and clouds are all digital. Modelers built the 777 so that it could easily break apart with blend shapes and rigid body simulations. “The plane flexes quite a bit,” says Monos. “But, once it breaks, it becomes more rigid.”
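A toy stand-in for that crowd setup - the cycle count comes from the article, but the frame counts, mirroring, and the function itself are invented for illustration - could assign each spectator a cycle and a random offset:

```python
import random

NUM_CYCLES = 70          # roughly the number of crowd motion cycles used
CYCLE_LENGTH = 120       # frames per cycle, assumed for illustration

def assign_crowd_clips(num_agents, seed=1):
    """Give every stadium agent a cycle index and a frame offset so the
    crowd doesn't move in lockstep."""
    rng = random.Random(seed)
    return [
        {
            "agent": i,
            "cycle": rng.randrange(NUM_CYCLES),
            "offset": rng.randrange(CYCLE_LENGTH),
            "mirror": rng.random() < 0.5,   # mirror half the agents for extra variety
        }
        for i in range(num_agents)
    ]
```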

As Superman wrestles with the plane, deformation and muscle systems in the rigs helped animators make the shot convincing. “We could sculpt him with sliders here and there to make sections of his arms more defined and to tighten his abdomen as he grabs something,” says Jones. To create the smoke and fire emanating from the plane, effects artists started with fluids simulated in Maya. Then, to manipulate the fluids further, they moved particles driven by the simulation data into Houdini to add resolution and turbulence.
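Conceptually, that hand-off advects particles through the cached fluid velocities and layers on detail the coarse sim cannot resolve; the sketch below assumes a velocity-lookup callable and uses plain random noise where a production setup would use something like curl noise:

```python
import numpy as np

def advect_particles(points, sample_velocity, dt=1.0 / 24.0,
                     turbulence_amp=0.5, seed=0):
    """Move smoke/fire particles by the coarse fluid sim's velocity
    field, then add fine-scale noise the sim itself can't resolve.

    points          : (N, 3) particle positions.
    sample_velocity : callable mapping (N, 3) positions to (N, 3)
                      velocities from the cached fluid sim
                      (assumed available as a lookup).
    """
    rng = np.random.default_rng(seed)
    vel = sample_velocity(points)
    # Stand-in for a proper turbulence layer on top of the sim velocities.
    vel = vel + turbulence_amp * rng.standard_normal(points.shape)
    return points + vel * dt
```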
“Houdini is a good hub for mixing and matching sims from different sources,” Vilela says, “so we didn’t have smoke blowing from one direction and flames from another.” When the smoke and fire elements needed to interact with the plane and debris, RenderMan handled the reflections and shadows. When the interaction was limited, Splat sufficed. But of all the effects created at Imageworks, Vilela believes the clouds were the most challenging.
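Put as a toy routing rule - the element attributes and their names are invented, not Imageworks' pipeline logic - the renderer choice reads roughly like this:

```python
def choose_renderer(element):
    """Route a volumetric element to a renderer based on how much it
    must interact with hard-surface geometry.  'element' is a dict with
    hypothetical keys; the criteria are illustrative.
    """
    needs_interaction = element.get("casts_shadows_on_geo", False) \
        or element.get("reflected_in_geo", False)
    if needs_interaction:
        return "renderman"   # full shadow/reflection interaction with the plane and debris
    return "splat"           # Imageworks' in-house renderer for freer-standing elements
```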

    To make it possible to art direct the clouds, the crew started with Maya models - simple geometry - that, once approved, moved into Houdini where the geometry turned into point clouds with thousands of particles. Once in Houdini, the effects artists groomed the clouds and assigned shading properties that changed the density of particular areas. “Houdini is our bridge to RenderMan, so we were able to customize shading groups through the clouds,” says Vilela. A clustering RenderMan DSO converted the thousands of points in the Houdini clouds into hundreds of millions of particles and made it possible for Superman to fly through the clouds with the camera trailing behind and god rays shining through. “It’s beautiful,” Vilela says.
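The clustering step amounts to spawning many child particles around each groomed parent point at render time; this simplified sketch mimics that idea in Python, though the real conversion happened inside a RenderMan DSO and the parameters here are illustrative:

```python
import numpy as np

def upsample_cloud_points(parents, radii, densities,
                          children_per_point=200, seed=2):
    """Expand groomed cloud points into dense child particles.

    parents   : (N, 3) parent point positions from the groomed cloud.
    radii     : (N,) footprint radius of each parent.
    densities : (N,) shading density assigned during grooming; here it
                simply scales how many children each parent spawns.
    """
    rng = np.random.default_rng(seed)
    out = []
    for p, r, d in zip(parents, radii, densities):
        n = max(1, int(children_per_point * d))
        offsets = rng.normal(scale=r, size=(n, 3))  # jitter children inside the footprint
        out.append(p + offsets)
    return np.vstack(out)
```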

Effects such as this helped create the emotional impact the director wanted. In one 900-frame shot, for example, Superman flies through the clouds right up to the camera. “He smiles and enjoys the moment,” says Jones. “It was great to make it feel as epic and big as Bryan [Singer] envisioned it.”

At one time, digital doubles simply replaced stunt doubles in shots that were too dangerous or impossible for humans to do. The first digital double was, arguably, Batman in 'Batman & Robin', released in 1997, created to help Batman take a 60-foot tumble. An effects team at Pacific Data Images (PDI) created the stunt double.

John Dykstra was the visual effects supervisor on that film, and he went on to win the Oscar for best achievement in visual effects for 2004's 'Spider-Man 2', for which Imageworks pushed the state of the art by using LightStage 2 to duplicate Alfred Molina’s face for the digital double. Now, they’ve done it again. Imageworks created the multiple extreme close-ups of Superman’s face that appear during flying scenes. But because they occupy half the screen and look so much like Brandon Routh, they prove that digital doubles can do far more than stunt work.
    Related Links
    Superman site
    Warner Bros.
    Sony Pictures Imageworks
    Light Stage 2
