    We interviewed Christian Aubert, Senior Technical Director at Frag-Ment, about the creation of the Age of Mythology cinematics for Ensemble Studios.
    Starting out in the early 90s, Christian Aubert worked as a TD/CG artist for several entertainment companies in Montreal and Quebec City. He moved to Los Angeles in 1999 to join Digital Domain as a Technical Director, where he worked on "X-Men" and "Red Planet" along with a number of commercials for Dodge, Autotrader, Ericsson, Motorola and American Express.

    He then moved on to a TD position with Base 2 Studios in Santa Monica, where he worked on the Steven Spielberg film "Minority Report", commercials for such clients as EMC and Hyundai, and music videos for Destiny's Child and Dave Matthews. Christian joined up with Frag-Ment as a Senior TD in August 2001 to work on the "Age of Mythology" game cinematic for Ensemble Studios.

    3D Festival: Can you tell us a little about Frag-Ment and your responsibilities in the creation of the Age of Mythology cinematics?

    Christian: Frag-Ment is a full-service film, commercial and cinematic animation studio. We're trying to play to our strength, which is character animation.

    I was technical director on the show, which meant handling pretty much all shots involving dynamics, crowds, and general software and scripting requirements. I was also responsible for our production pipeline.

    3D Festival: Can you elaborate on what methods were used to model and animate the characters?

    Christian: Pretty much everything was done using subdivision surfaces in LightWave, with the exception of the super low-res characters used in the crowd shots, which were straight poly models. The models were then brought into Maya to be rigged and animated.

    3D Festival: How were the special f/x handled and what software was used to accomplish them?

    Christian: Obviously, our workhorse applications are Maya and LightWave, so we did as much as possible there. Most of the dust effects were done in LW, and the Kronos particles were done in Maya. The meteor sequence was a combination of LW HyperVoxels and Illusion particles. Particles in LW were a lot more interactive than in Maya; with the number of shots we had to do, that was a time saver.

    You do have more control using Maya particles, but most of that control comes through custom expressions, and things get really slow when you have thousands of particles in the scene.
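
    As a concrete picture of the kind of per-particle control Christian is describing, here is a minimal sketch in Maya's Python layer, assuming a simple emitter-driven dust pass; the object names, emission rate and noise-based drift are illustrative assumptions, not the expressions used on the show.

        import maya.cmds as cmds

        # Minimal stand-in for a dust pass: an omni emitter feeding a particle object.
        cmds.emitter(name="dustEmitter", type="omni", rate=500)
        cmds.particle(name="dustParticles")
        cmds.connectDynamic("dustParticles", emitters="dustEmitter")

        # Per-particle runtime expression (MEL), evaluated for every particle on
        # every frame -- which is where the slowdown with thousands of particles comes from.
        drift = ("dustParticlesShape.velocity += "
                 "0.5 * dnoise(dustParticlesShape.position * 0.2);")
        cmds.dynExpression("dustParticlesShape", string=drift,
                           runtimeBeforeDynamics=True)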

    3D Festival: Can you tell us about the lighting used throughout the cinematic? It appears Global Illumination was used in the battle scenes; is this correct?

    Christian: The foreground characters were all rendered with Global Illumination. The backgrounds in the temple sequences were rendered with GI baked into the models, whereas the backgrounds in the battle sequences used full GI.
    Rendering the foreground and background separately is a common practice in high-end film production, and it also gives us more control in compositing.

    Lighting in most of the shots was really simple, with only one or two key lights, and GI took care of the rest.
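
    Since Christian mentions that separate foreground and background passes give more control in compositing, here is a small, generic illustration of the idea (standard "over" compositing, not Frag-Ment's Combustion setup): the foreground is rendered with an alpha channel, can be graded on its own, and is then layered onto the background.

        def over(fg_rgb, fg_alpha, bg_rgb):
            """Premultiplied 'over' composite of one pixel: fg + (1 - alpha) * bg."""
            return tuple(f + (1.0 - fg_alpha) * b for f, b in zip(fg_rgb, bg_rgb))

        # Example pixel: a semi-transparent character edge over a baked-GI background.
        fg = (0.4, 0.3, 0.2)      # premultiplied foreground colour
        bg = (0.1, 0.2, 0.5)      # background colour
        print(over(fg, 0.5, bg))  # approximately (0.45, 0.4, 0.45)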


    3D Festival: Can you give us some insight into the crowd system used for the battle between the two armies?

    Christian: I'd done some research into flocking before, so I expected to write something from scratch. Then I heard that Paul Kruszewski was coming out with the AI.implant plug-in for Maya. We did a lot of pre-viz with that plug-in, and it worked fine until we tried scaling up our scenes from 300 to 3,000 characters. The plug-in, which was in beta at the time, just could not handle the amount of data we were trying to push through it. They eventually solved the performance issues, but it came a few months too late for us.

    We ended up going with good old particles, combined with some custom code that let us attach various characters and motion files, and then add some variation to individual characters' size, texturing and so on.
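
    A hypothetical sketch of that idea follows: treat each crowd member as a particle-like record and attach a character, a motion clip and some per-agent variation. The asset names, motion files and value ranges are assumptions for illustration, not Frag-Ment's actual data.

        import random

        CHARACTERS = ["hoplite", "archer", "myth_unit"]    # assumed character assets
        MOTIONS    = ["run_A", "run_B", "charge"]          # assumed motion files
        TEXTURES   = ["armor_01", "armor_02", "armor_03"]  # assumed texture variants

        def build_crowd(count, seed=0):
            """Assign each agent a character, a motion clip and per-agent variation."""
            rng = random.Random(seed)
            crowd = []
            for i in range(count):
                crowd.append({
                    "id":          i,
                    "character":   rng.choice(CHARACTERS),
                    "motion":      rng.choice(MOTIONS),
                    "texture":     rng.choice(TEXTURES),
                    # Small random scale so no two soldiers read as identical.
                    "scale":       rng.uniform(0.9, 1.1),
                    # Offset the clip start so agents don't step in unison.
                    "clip_offset": rng.uniform(0.0, 2.0),
                })
            return crowd

        if __name__ == "__main__":
            for agent in build_crowd(5):
                print(agent)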

    3D Festival: In retrospect, is there any aspect of the crowd system that you would improve?

    Christian: There was no provision in that software for state changes, such as when a character comes into contact with another and then goes into battle.

    We either had characters running or fighting. But if I were to start from scratch today, I'd probably just use the current version of AI.implant, which has become pretty fast and stable.
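
    The missing state-change behaviour Christian describes could look something like the toy sketch below: each agent stays in a "running" state until an enemy comes within contact range, at which point both switch to "fighting". The contact radius and agent layout are illustrative assumptions.

        from dataclasses import dataclass

        CONTACT_RADIUS = 1.5  # assumed contact distance in scene units

        @dataclass
        class Agent:
            x: float
            z: float
            team: int
            state: str = "running"

        def update_states(agents):
            """Switch an agent to 'fighting' once an enemy is within contact range."""
            for a in agents:
                for b in agents:
                    if a is b or a.team == b.team:
                        continue
                    if (a.x - b.x) ** 2 + (a.z - b.z) ** 2 <= CONTACT_RADIUS ** 2:
                        a.state = b.state = "fighting"

        armies = [Agent(0.0, 0.0, team=0), Agent(1.0, 0.0, team=1), Agent(10.0, 0.0, team=0)]
        update_states(armies)
        print([a.state for a in armies])  # ['fighting', 'fighting', 'running']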

    3D Festival: At the end, we see the tomb our hero was venturing into start to cave in. Can you elaborate on how this was done?

    Christian: That was basically all Maya rigid body dynamics. Everything that had to crumble was pre-scored, and proxies were made to speed up the dynamics simulation. Then, it was just a long process of tweaking variables until we had the result we wanted.
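
    For readers unfamiliar with that workflow, here is a minimal maya.cmds sketch of the general approach, assuming a stack of low-res proxy pieces standing in for a pre-scored column; the object names, piece count and masses are illustrative, and the production scenes used proxies of the actual scored geometry.

        import maya.cmds as cmds

        # Low-res proxy pieces standing in for a pre-scored column.
        pieces = [cmds.polyCube(w=1, h=1, d=1, name="columnProxy#")[0] for _ in range(5)]
        for i, piece in enumerate(pieces):
            cmds.move(0, 0.5 + i, 0, piece)               # stack the pieces into a column
            cmds.rigidBody(piece, active=True, mass=1.0)  # active bodies react to forces

        ground = cmds.polyPlane(w=20, h=20, name="groundProxy")[0]
        cmds.rigidBody(ground, passive=True)              # passive body: collides but doesn't move

        grav = cmds.gravity()[0]
        cmds.connectDynamic(pieces, fields=grav)          # let gravity pull the stack apart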

    3D Festival: What was the most difficult task to accomplish in the cinematic and why?

    Christian: Even though the degree of difficulty in getting most of the crowd shots wasn't that high, the amount of data we were pushing and the level of coordination required to make everything work smoothly made them difficult. The crumbling columns at the end were also a big challenge (but in retrospect, also a lot of fun).

    3D Festival: How long was the production time for the entire cinematic from start to finish?

    Christian: Production was about eight months from start to finish, with a couple of months for pre-production.

    3D Festival: Can you tell us what your favorite scene in the cinematic was?

    Christian: I think the coolest shot is the one where we can see all the myth units, then all the humans running towards each other and engaging in battle.

    3D Festival: What software and hardware were used for the effects in the cinematics?

    Christian: On the 3D side, we basically used Maya as our animation tool and LightWave as our modeling and rendering tool, with TheBeaverProject handling the back-and-forth transfers. Compositing was done in Combustion.
    We had dual P4s as workstations and dual Athlon MPs on the render farm. All boxes had 2 GB of RAM, which was required because of the amount of geometry we were trying to push, and the fact that we don't use multi-threaded rendering on dual-proc machines.

    3D Festival: Can you tell us about some of your upcoming projects?

    Christian: We're looking into taking computer-generated characters to the next level. On this project, we modeled our characters based on real-life models. Most people thought the Arkantos character looked like Ben Affleck, even though his real-life counterpart doesn't resemble Ben in the slightest. We were also able to capture a lot of human expression through a technique we developed using Cyberscan data as templates. There were no talking scenes on this project, but audiences are able to feel what the characters are going through from their expressions. On the next project we'd like to improve on what we've developed and integrate talking characters into our pipeline.

    3D Festival: Thank you very much, Christian, it was a pleasure. We look forward to seeing more great work from Frag-Ment in the near future.

    Christian: It was a pleasure!


    Links
    Ensemble Studios
    Frag-Ment
    Ensemble Studios - Age of Mythology development website
    Microsoft Game Studios - Official Age of Mythology website

    Interview: Tito A. Belgrave
    Images: Ensemble Studios/Microsoft, Frag-Ment

    Special thanks to David Lo, Christian Aubert and B. Ian Hayden of Frag-Ment.
