• CGSociety :: Technology Focus
    22 February 2011, Paul Hellard

    "The value of being able to walk out into a space and explore a virtual world by physically controlling a camera, finding the perfect angles and then shooting multiple takes with the OptiTrack Insight VCS, is truly remarkable. All of this is achieved in a fraction of the time it takes to key frame a single camera take, and the results are more natural."
    - Justin Denton, Previs Supervisor, HALON Entertainment.

    One day in Montreal at ADAPT 2007, HALON Entertainment's owner and Previsualization Director Dan Gregoire showed off a previs system he had bundled together himself: a compass and a mobile phone, all feeding data to his laptop. All useful bits of technology, but not yet collected into a practical off-the-shelf product.

    Everyone has also seen the videos of James Cameron moving around the AVATAR set with his Virtual Camera, a rig with a Panasonic HiDef screen and an SDI transmitter Velcro-ed onto the back. It was a brilliant one-off.

    The OptiTrack Insight VCS is a production-grade system, ready for the market. CGSociety talked to Previsualization Supervisors AJ Briones and Justin Denton from HALON Entertainment about the technology they used on Halo: Reach.

    Justin Denton, HALON Entertainment, Previsualization Supervisor

    "We realized we had to move forward," said Denton. "We knew MoCap would speed up our pipeline.

    At HALON Entertainment, our focus is still on cameras, and we understand that getting the human element of someone holding a camera, while it can be done with key-framing, is a pretty painstaking process.

    It's a fine art to make it feel like there's a cameraman actually holding that camera."

    The HALON Entertainment group took a look at all the different motion capture systems out there. This was in the spring of '09, when there weren't any actual products on the market. "Sure, Jim Cameron was doing AVATAR back then with a system created for the shoot, and Jim Henson Studios has been using their version of a Virtual Camera since the '90s," explains Denton.

    "The big studios have had proprietary solutions but there weren't any acceptable products out there for smaller companies. We'd done all the trade shows like SIGGRAPH and GDC looking for a smaller, cheaper product that would fit for our kind of work. At the time, OptiTrack had a very cost-effective actor-based motion capture system starting at around US$6,000."

    AJ Briones was the Virtual Cinema Lead Artist on AVATAR, and Previs Supervisor on the Halo: Reach commercial. He has built and managed MoCap studios for Midway Games and Vivendi Universal Games.

    "Back then I'd use just about anything [as a virtual camera]," quips AJ. "In fact, back then we were seeing the kind of Virtual Camera being used by Cameron on AVATAR, and a lot of those elements have since been brought into OptiTrack products."


    AJ Briones, HALON Entertainment, Previsualization Supervisor
    AJ used an earlier version of the OptiTrack Insight VCS virtual camera system on the Halo:Reach commercial spot. "I couldn't step between prime lenses," Briones says. "I had to get close to what lens I wanted and just nail it down in post.

    But I had the viewfinder and could control scale and platform myself to moving objects, which is similar to what we had on AVATAR."
    AJ says he also had playback and recording controls at his fingertips and could step through a scene frame-by-frame. "I didn't have to rely on an operator," he says.

    For Halo: Reach, the introduction of the OptiTrack Insight virtual camera system was a watershed moment. They could now create a scene with certain actions built into it, and that would become AJ Briones' master file.
    He could go in and gather his storyboard shots using this complete product. "From there I could go off the beaten path and just go and find stuff," explains AJ. "In Previs, having the Insight VCS makes things happen that just weren't possible before.

    Earlier, when you got a story idea or script pages, you'd have to figure out exactly which shots were needed, and you might have just enough time to complete them by deadline.

    Now, you take these same story elements and build more comprehensive master scenes, very much like what you'd be doing on a live action set, and you can go in and shoot as many shots as you want in a very short amount of time.

    This is much closer to how production works on a live action set. They get to shoot until they run out of money and we get to shoot until we run out of time. Coverage is really the name of the game in our new shot work-flow paradigm."

    'Halo:Reach' © Microsoft

    The dataset would then be pulled into both Maya and MotionBuilder. HALON Entertainment first used an alpha version of the Insight VCS, with OptiTrack data driving a virtual camera tool in Maya. Capturing motion in finer detail is also possible, at up to 250 frames per second. The OptiTrack line offers a complete set of tools to integrate into the virtual production pipeline, including body and face mocap in addition to the virtual camera system that HALON uses.

    HALON Entertainment has a 30 ft. by 25 ft. OptiTrack motion capture stage at their home location in Santa Monica, and they also have road kits that can be taken out on location. One recent example was an Insight VCS stage in Baton Rouge, Louisiana, which was set up and ready to use in less than half a work day.

    In addition to Halo: Reach, HALON Entertainment has used the Insight VCS on the feature films John Carter of Mars, directed by Andrew Stanton, and Battleship, directed by Peter Berg, and continues to use OptiTrack products on many of its major feature film and video game projects.



    The main goal was to create something useful rather than something merely technologically cool. NaturalPoint concentrated on making a system with the lowest possible latency, realistic controls, and a very good video display, so that what appears on the display is a faithful representation of the virtual world.

    "We decided the best way to go about this would be with a wired system via an umbilical cord, which can be up to 100 feet long and carries both signal and power," explains NaturalPoint Co-founder and Lead Engineer for the OptiTrack motion capture line, Jim Richardson. "No batteries required."
    This system works as a video and command relay from the PC up to the Virtual Camera, relaying HiDef video at 1280x768. There are also control mechanisms, all USB-based.

    There are a total of four USB ports for control sticks, and these controls can be personalized as well. A follow-focus kit can be used, or a zoom toggle in a different form factor; in fact, anything can be plugged in here, and as long as it is DirectInput compatible, it will work with the Insight VCS plugin.

    This makes it very easy for people to extend the system on their own. The HALON Entertainment people wanted a collection of zooms and prime lens effects to be available with their rigs.

    "All the major motion systems are available in this system," explains Richardson. "Global scaling, zooming, panning and tilting. There are also deck-playback controls on the monitor. Each can be set up to operate the plugin that comes with the Insight VCS kit.

    The Insight VCS 'knows where it is' at the start of a session, because you start out with a rigid body formed by the markers attached to the camera system. You set a Z-axis, and thereby you have calibrated your motion capture volume.

    If you keep those two in alignment, you will always be within the same coordinate system as your capture volume.

    You can also just dolly yourself around with the joysticks. With these joysticks or toggles, every cinematographic movement is available. Zooming, panning, dolly moves and scaling.

    There are six different button controls which can be assigned to plugin functions. Global moves, including Global Scaling, are also available.

    There are motion-dampening functions that allow your moves to resemble flying through chasms in a helicopter; then again, there is one to mimic the cinéma vérité look.
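    The dampening modes described above come down to filtering the raw tracked motion before it reaches the virtual camera. As a rough illustration of the idea (my own sketch, not NaturalPoint's implementation), a single exponential-smoothing factor can slide between a glassy helicopter glide and raw handheld jitter:

```python
def smooth_positions(raw, alpha):
    """Exponentially smooth a sequence of (x, y, z) camera positions.

    alpha near 0 -> heavy dampening (helicopter-glide feel);
    alpha near 1 -> raw motion passes through (handheld, verite feel).
    Illustrative only; the real plugin exposes custom curve settings.
    """
    smoothed = [raw[0]]
    for x, y, z in raw[1:]:
        px, py, pz = smoothed[-1]
        smoothed.append((
            px + alpha * (x - px),
            py + alpha * (y - py),
            pz + alpha * (z - pz),
        ))
    return smoothed
```

A sudden one-unit step in the operator's position, run through alpha=0.5, arrives at the camera as a gentle ease toward the new spot rather than a jolt.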

    A lot of the effort went into the creation of the plugin. We have live plugin control for both Maya and MotionBuilder. All it needs is the rigid body tracking data, which you can get from one of OptiTrack's motion capture software packages, ARENA or Tracking Tools. There is a universal version of the plugin that allows any motion capture system to work with the Virtual Camera. OptiTrack Software Engineer Morgan Jones created the plugin and did much of the development work adapting it to Maya and MotionBuilder for use with the Virtual Camera.
    "Tracking markers can be placed on any object and in our case we have the Insight VCS Pro with dedicated thumb stick controllers. The virtual camera can be placed anywhere in a capture volume and you're pretty much good to go," explains Jones.

    "The MoCap system streams the position and orientation information across to Maya or MotionBuilder in real time. In these apps, tracking position is mapped directly onto the camera's position, and the user can map controller buttons onto different camera properties."
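    The mapping Jones describes can be pictured with a small sketch. All names here are hypothetical, not the actual plugin code: each streamed sample carries a position and orientation that is copied straight onto the camera, while a button table routes controller presses to camera properties such as stepping through a set of prime lenses:

```python
class VirtualCamera:
    """Toy model of the pose-to-camera mapping (illustrative only)."""

    def __init__(self):
        self.position = (0.0, 0.0, 0.0)
        self.rotation = (0.0, 0.0, 0.0, 1.0)    # quaternion
        self.focal_length = 35.0                # mm
        self.primes = [24.0, 35.0, 50.0, 85.0]  # assumed lens set
        # Button table: controller button -> camera action.
        self.bindings = {
            "button_a": self.next_prime_lens,
            "button_b": self.prev_prime_lens,
        }

    def on_sample(self, position, rotation):
        # Each streamed mocap sample maps directly onto the camera.
        self.position = position
        self.rotation = rotation

    def on_button(self, name):
        # User-configurable: any button can drive any camera property.
        if name in self.bindings:
            self.bindings[name]()

    def next_prime_lens(self):
        i = self.primes.index(self.focal_length)
        self.focal_length = self.primes[min(i + 1, len(self.primes) - 1)]

    def prev_prime_lens(self):
        i = self.primes.index(self.focal_length)
        self.focal_length = self.primes[max(i - 1, 0)]
```

In the real pipeline the `on_sample` step would write to a Maya or MotionBuilder camera node each frame; the point of the sketch is that the tracked pose is applied verbatim while discrete controls are a separate, remappable layer.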

    ARENA MoCap

    "The Insight VCS, tracked with optical motion capture, provides an absolute coordinate system that marries your virtual and real worlds together. Scaling and offsets let users have a 'hand-held' look, or zoom around a city; that is the benefit of a virtual world. If you want to add effects, like smoothing or special zooms, you can easily define these motions in custom curve settings," explains Jones.
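    The scaling and offsets Jones mentions amount to a simple affine map from capture-volume coordinates to virtual-world coordinates. A minimal sketch of that idea (assumed parameter names, not the plugin's API):

```python
def to_virtual(tracked, scale=1.0, offset=(0.0, 0.0, 0.0)):
    """Map a tracked (x, y, z) position into the virtual world.

    scale=1 reproduces handheld moves at human size; a large scale
    lets a few steps on the stage sweep the camera across a city.
    """
    tx, ty, tz = tracked
    ox, oy, oz = offset
    return (scale * tx + ox, scale * ty + oy, scale * tz + oz)
```

With scale=100, walking one meter across the mocap stage carries the virtual camera a hundred meters through the scene, which is exactly the "zoom around a city" effect described above.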


    Jim Richardson, NaturalPoint Co-founder and Lead Engineer.

    About 14 years ago, a cousin of Jim Richardson had a debilitating accident and became profoundly disabled. "He couldn't communicate verbally anymore at all," explains Richardson. "One thing he could still do was move his eyes. His injury was through the speech center in his brain." At the time, there weren't any eye-tracking solutions available. Even though Jim was still in high school, he set out to make his own. "A camera would watch the eye and a cursor could then be moved on a virtual keyboard on a screen," says Jim. "My cousin could then type with his eyes."

    These days, Jim Richardson's business is impressing more than just the medical field. Nvizage bought the Insight VCS and the OptiTrack Motion Capture system on the strength of a demo they saw at SIGGRAPH 2010. Martin Scorsese is using the Virtual Camera on his latest film, as is the crew of 'Clash of the Titans'. Microsoft Games, the USC film school, even people who own VICON systems have bought one.

    HALON Entertainment
    Halo:Reach Trailer comparison reel

