    Creation of the long-awaited Nordic tale generated months of work for hundreds of VFX artists.

    CGSociety :: Production Coverage
    18 May 2011, by Renee Dunlop

    Thor was sent to earth by Odin, banished and stripped of his power, but a few digital effects gods were here, readily poised to make Thor’s story a spectacle that was out of this world. VFX Supervisor Wes Sewell was one, brought on board even before Director Kenneth Branagh. Was this unusual? “For big visual effects movies it has been done,” Wes Sewell explains. “I was a Marvel choice, I was presented to Ken, and Ken agreed. I was one of the first hires on the film, after coming off the first Iron Man. I was able to get in on the very early stages with Production Design, getting familiar with Marvel’s ideas.”  

    That was no small task. They finished shooting about a year before release, leaving just ten months for the VFX. “It was bloody hard to get it all done in that amount of time.” In the end they had 1,305 FX shots out of 1,939 total. A full 70 minutes of the film is visual effects, work that was split pretty evenly between Digital Domain (DD) in Venice, California, and Buf in Paris. Whiskytree created Asgard, the Norse version of Heaven, the place Thor comes from. One of Sewell’s favorite characters was a completely CGI creation called The Destroyer, a giant metal beast done by Luma Pictures in California. “They are a magnificent medium-sized house that does 2D and 3D,” said Sewell, “and did pretty much any of our earthbound stuff.” They did volumetric storms, tornadoes and all the lightning.


    PREVIS at THE THIRD FLOOR
    Sewell brought on “a robust previs crew” from The Third Floor who handled the previs, tech-vis, and post-vis over an 18-month period under the supervision of Gerardo Ramirez. “We began creating all the big action sequences based on the original concept art, the storyboards, and the animatics, bringing them into the world of previs where we have actual cameras, real cranes, and characters in three dimensions,” explained Sewell. Production Design was also working in 3D, so The Third Floor was able to use those models in the tech-vis, modeling the stages and placing virtual greenscreens and lighting rigs, “so we had something we could show the studio and Director and say, here’s what the shot would look like previs-wise, but here’s what it’s going to be like when we shoot it.”

    By using tech-vis, Sewell was able to provide information on how to set up the physical shoot. “We would also shoot our tech-vis with objective cameras so you could see how the characters would be on set, how we would have to do our setups. What previs and tech-vis allow you to do is understand the concepts that the director, the studio, and the script wanted to get across, and by going through these processes we were able to build all of the rigs; I was able to design rigs that worked.”
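    As an illustration of the kind of question tech-vis answers before a single flat is built, the snippet below works out how wide a greenscreen has to be to fill a lens’s horizontal field of view at a given distance. The numbers and the helper itself are invented for this article; they are not taken from the production’s tech-vis.

```python
import math

def greenscreen_width(fov_deg: float, distance_ft: float) -> float:
    """Width (in feet) a backdrop must span to fill a lens's
    horizontal field of view at the given camera-to-backdrop distance."""
    return 2.0 * distance_ft * math.tan(math.radians(fov_deg) / 2.0)

# Illustrative numbers only: a 40-degree horizontal FOV framed 30 ft
# from the backdrop needs a greenscreen roughly 22 ft wide.
print(f"{greenscreen_width(40.0, 30.0):.1f} ft")
```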


    BUF
    Under VFX Supervisor Nicolas Chevallier and VFX Producer Pierre Escande, Buf did all the cosmic backgrounds seen around Asgard and the entire final battle that takes place on the Rainbow Bridge and at Heimdall’s Observatory, a machine that can use energies to fire people across the universe. While Thor was a fantasy film, it had to have a literal truth to it. The theory was that in order to travel such distances you need to be going at superluminal speeds, faster than the speed of light. What happens is the traveling object stretches into a spectrum, and that spectrum of light is a rainbow. The Rainbow Bridge went through a long design phase to get it just right.

    Thor has two rainbow bridges. One is a physical bridge that runs from the middle of Asgard off to the edge of an ocean. It’s a large crystalline bridge, about 60 feet wide and six feet deep, made out of Asgardian crystal with energies flowing through it. “That was what we were after, but how do you photograph that?” Sewell came up with a concept to build five-foot platforms topped with one-inch Plexiglass planks. The Director of Photography put electronically controlled lighting beneath it. “We would get the underlighting and would also get the actors’ reflections in the Plexiglass, but it wasn’t quite reflective enough, so we ended up putting a thin Mylar sheet on top. We had massive greenscreens all around the stages, which were great for keying off your characters, but when we looked down into the reflections, sometimes we would be looking up into the light rigs and catwalks overhead. We had designed a greenscreen system that was on a series of winches and rails we could move into place so they would reflect as well.” All this was then keyed out and rotoed by Buf.

    DIGITAL DOMAIN
    The bulk of Digital Domain’s work was at the beginning of the film, a major battle sequence that takes place on a faraway planet called Jotunheim, one of the nine realms in Norse mythology and home to the Frost Giants. The Frost Giants, the Frost Beast, and the environments where they live, along with the Frost Giants that appear in other scenes of the film, were all done at Digital Domain. The Frost Giants were roughly ten feet tall, requiring a lot of scale tricks to show their height compared to the human characters, such as enlarging the chest cavity. The production also used live-action Frost Giants, suit work done by Legacy, Stan Winston’s old company. The stunt men were dressed as seven-foot characters, but Chris Hemsworth, who plays Thor, is 6’4”, requiring DD to make the Frost Giants appear even bigger, one of the methodologies refined during previs and tech-vis.
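    The scale problem is simple arithmetic, laid out below using the heights quoted in the article; the breakdown is only an illustration of the gap DD had to close, not a description of their method.

```python
# Back-of-envelope scale math using the heights quoted above; the
# calculation is illustrative, not DD's actual methodology.
THOR_FT = 6 + 4 / 12     # Chris Hemsworth, 6'4"
SUIT_FT = 7.0            # practical Frost Giant suit on a stunt performer
TARGET_FT = 10.0         # intended Frost Giant height

target_ratio = TARGET_FT / THOR_FT        # what the audience should read: ~1.58x Thor
plate_ratio = SUIT_FT / THOR_FT           # what the raw plate delivers: ~1.11x Thor
extra_scale = target_ratio / plate_ratio  # apparent scale the digital work must add: ~1.43x

print(f"needed on-screen ratio: {target_ratio:.2f}x Thor")
print(f"ratio in the plate:     {plate_ratio:.2f}x Thor")
print(f"extra apparent scale:   {extra_scale:.2f}x")
```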
     
    For the hundreds of Frost Giants and the Frost Beast, DD combined a mocap session at Giant Studios with keyframe animation, as DD’s VFX Supervisor Kelly Port explained. “You always have to clean up the motion capture and keyframe it a little bit just to get the contact points, like during a sword fight or when transitioning from a run to a walk or a run into a fight, any kind of transition, especially when they are fighting a live-action character. That was why we had a pretty big animation team. We did a RIB archive system. All of our rendering was done in RenderMan, we used Maya for modeling and animation, and we used ZBrush and Mudbox for higher-resolution sculpting and maps.”
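    The “RIB archive system” Port mentions is worth unpacking: in RenderMan, heavy geometry can be baked out once per frame as a .rib archive and pulled into the shot’s scene description with ReadArchive, so the renderer rather than the animation package carries the memory load. The sketch below shows that idea in a few lines of Python; the file names and layout are invented, only the ReadArchive mechanism is RenderMan’s, and this is not a description of DD’s actual pipeline.

```python
# Minimal sketch of a RIB archive workflow (invented names and layout):
# pre-baked per-frame character geometry is referenced from the frame
# RIB via RenderMan's ReadArchive, keeping the scene file tiny.

def write_frame_rib(frame: int, giants: list, path: str) -> None:
    with open(path, "w") as rib:
        rib.write(f'Display "jotunheim.{frame:04d}.exr" "openexr" "rgba"\n')
        rib.write('Projection "perspective" "fov" [45]\n')
        rib.write("WorldBegin\n")
        for giant in giants:
            rib.write("  AttributeBegin\n")
            # Pull in the pre-baked, already-deformed geometry for this frame.
            rib.write(f'    ReadArchive "archives/{giant}.{frame:04d}.rib"\n')
            rib.write("  AttributeeEnd\n".replace("ee", "e"))  # "AttributeEnd"
        rib.write("WorldEnd\n")

write_frame_rib(101, ["frost_giant_01", "frost_giant_02"], "frame.0101.rib")
```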

    DD also did the foreboding environment work, a very beautiful, very eerie world. It’s a different planet, a different realm, an ice planet, covered with enormous cliffs and canyons, and in decay as it melts from within. Branagh wanted the vision to appear foreboding, overwhelming, unwelcoming. The artists had to create two versions of Jotunheim, one pristine and one destroyed. Because the buildings had to be destroyed, DD used a huge RBD (Rigid Body Dynamics) simulation. “In order to fracture the geometry, you pass the geometry through an RBD system, but you can’t have intersecting or overlapping points,” said Port. “We were looking at models that were up to ten million polys. These models were so detailed and specific it quadrupled our modeling time in order to model these things to a specification that would work within the fracturing algorithms.” DD would render a first pass out of lighting with all the textures, maps and displacement applied, then would run an additional matte painting projection pass on top of that to add detail. That was done in Photoshop, then projected with up to six projection cameras depending on the extent of the perspective change. This all had to work in stereo for the all-CG shots, and stereo is more of a hit in terms of render hours. “I think we were close to two and a half million render hours on this film. That is a combination between Venice [California, DD’s main facility] and Vancouver. Vancouver wound up doing 250 out of our total of 337 shots.”
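    Port’s point about “intersecting or overlapping points” is a general constraint of fracture tools: coincident vertices make the resulting pieces ambiguous, so the geometry has to be scrubbed first. The snippet below is a generic, simplified version of that kind of pre-flight check, not DD’s tool.

```python
# Generic pre-fracture sanity check (not DD's tool): flag vertices that
# sit on top of each other within a tolerance, using a spatial hash so
# multi-million-point meshes stay tractable.
from collections import defaultdict

def coincident_points(points, tol=1e-4):
    """Return index pairs of points closer together than `tol`."""
    cells = defaultdict(list)
    pairs = []
    for i, (x, y, z) in enumerate(points):
        key = (round(x / tol), round(y / tol), round(z / tol))
        # Compare against earlier points in this cell and its neighbors.
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for dz in (-1, 0, 1):
                    for j in cells[(key[0] + dx, key[1] + dy, key[2] + dz)]:
                        px, py, pz = points[j]
                        if (x - px) ** 2 + (y - py) ** 2 + (z - pz) ** 2 < tol * tol:
                            pairs.append((j, i))
        cells[key].append(i)
    return pairs

# Two of these four vertices are effectively coincident and would need
# merging before the mesh could go to the fracture step.
verts = [(0, 0, 0), (1, 0, 0), (1.00001, 0, 0), (0, 1, 0)]
print(coincident_points(verts, tol=1e-3))   # -> [(1, 2)]
```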


     
    STEREO
    Another of Sewell’s tasks was overseeing the conversion to stereo, handled by Stereographer Graham Clark at Stereo D in Los Angeles. There were around 80 true stereo shots, entirely CG and rendered natively in stereo rather than converted, while roughly 50-60% of the film was straight conversion.

    “We knew about the controversy of converting to stereo going in,” said Sewell. “In preproduction we had to make a decision about what we were going to do. If we wanted to shoot in 3D there were a number of camera crews that could handle it, but not that many, and it probably would have extended our shooting schedule by about 25%.” Sewell had spent a good deal of time investigating conversion technology and came to the conclusion that many of the failures that had happened were mainly due to time. “I took a leap of faith. I said I know what these technologies are, I know the image analysis, I know the 3D tracking, I know how far it’s come in the last few years, and by the time we get to the post production in our film it will be even more advanced. I said give me the time and I’ll make sure it comes out right.”

    All four main vendors had shots that were fully CGI and were simply delivered with two-eye renders. There were also hybrid conversion shots, with DD handling most of these. Straight conversion shots were referred to as Type One. Type Two shots were not converted; they were straight CGI with both eyes rendered. The Type Three shots were a combination of the two, such as one of the scenes with a partial set built and greenscreens covering the areas the set didn’t. A lot of wind, snow, and mist was to be added. The DP had real snow and mist rigs, but Sewell, who earned the nickname The Weatherman, wanted to add most of that in 3D. “If they had covered this whole scene with snow and mist, the conversion would have been impossible or extremely difficult, and perhaps not successful.”

    To handle the snow, DD did a composite, finishing the shot with all the elements squashed into a flat 2D version that was finaled. “Then we would pull all the pieces apart, separate out the mist and snow from the backgrounds and Frost Giants, and give the elements to Stereo D, who would then do conversions, adding in the Type Two elements,” said Sewell. With volumetrics or particle systems that were near the characters, Stereo D would convert the original photography plate, and those two eyes were given to a vendor to finish the 3D elements. “Really, only half our film was converted. The rest of it we worked in a true stereo world. Because we converted the film to 3D, the conversion technologies are the same tools we use to do our visual effects work. Everything from rotoscoping to paint work to 3D modeling, all these tools are the same tools they use for conversion, so when you convert a film to stereo, essentially every shot in your film is a visual effects shot.”
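    Put schematically, the three shot types amount to a simple routing decision per shot. The sketch below restates the taxonomy in code purely as an illustration; the enum, function, and inputs are invented, and the real call was of course made shot by shot.

```python
# Schematic restatement of the article's shot taxonomy (names invented).
from enum import Enum

class ShotType(Enum):
    TYPE_ONE = "photography, straight conversion by Stereo D"
    TYPE_TWO = "all-CG, both eyes rendered by the vendor"
    TYPE_THREE = "hybrid: converted plate plus stereo-rendered CG elements"

def route_shot(has_photographed_plate: bool, has_stereo_cg_elements: bool) -> ShotType:
    """Pick which of the three paths a shot takes to stereo."""
    if has_photographed_plate and has_stereo_cg_elements:
        return ShotType.TYPE_THREE
    if has_stereo_cg_elements:
        return ShotType.TYPE_TWO
    return ShotType.TYPE_ONE

print(route_shot(True, False))   # straight conversion
print(route_shot(False, True))   # true stereo CG
print(route_shot(True, True))    # hybrid, like the snow-and-mist scene
```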

    Related links:
    Thor
    Marvel
    J. Michael Straczynski, story
    Wes Sewell, VFX Supervisor
    Gerardo Ramirez
    The Third Floor
    Kelly Port, VFX Supervisor, Digital Domain
    Digital Domain
    Nicolas Chevallier, VFX Supervisor, Buf
    Pierre Escande, VFX Producer, Buf
    Buf
    Luma Pictures
    Graham Clark, Stereographer
    Stereo D
    Whiskytree
    Giant Studios
    Gentle Giant
    Legacy
    RenderMan
    ZBrush
    Mudbox

    Writer: Renee Dunlop


