
Making Order out of Chaos

Wed 14th May 2014, by Rory Fellowes | Production


A few months ago I was in a taxi in Dublin. I always like to talk to my taxi drivers, so I asked this man where he was from, since it was obvious he was not Irish. He told me he came from Bulgaria. I said I had always wanted to visit Sofia, to which he replied that he was surprised I knew the name of his country’s capital city. Most people, so he said, did not. I hope that after reading this article, you will all know and remember that famous and beautiful city, if only because it is the home of the world-renowned Chaos Group, creators of V-Ray, one of the most successful rendering packages currently in use in the CG industry.


The Chaos Group was co-founded in the late 1990s by Peter Mitev, the current CEO, and Vladimir (known as Vlado) Koylazov, the Lead Developer. “It’s the classic tale,” Vlado told me. “We met in college, at Sofia University. I was a freshman there [in 1997], studying computer graphics, and Peter was in the same group of students, so we had the same classes and we saw each other. And that’s it really. We had a sort of computer lab, and because I knew most of the stuff, I was spending my time in the labs working on a very simple game engine that I wrote when I was in high school. Peter saw that and basically asked me if I wanted to work together on computer graphics software, and that’s how it started.”

 

Along with Lon Grohs, the Chief Commercial Officer, these men are the brains, not to mention the energy and vision, behind the Chaos Group.

 

    

Peter Mitev, Vlado Koylazov, Lon Grohs

 

You may have read my diary/blog in April, which I wrote from FMX 2014, the VFX and CG industry conference staged every year in Stuttgart (link below). In it you will have seen my mention of meeting Peter Mitev while I was there. I followed up on that meeting with Skype conversations with Lon and Vlado (Lon in LA, Vlado in Sofia, all of us online – I love living in the 21st Century!!).

 

Inspired by FMX, I wanted to learn more about the company and, with luck, add some technical flesh to the blog’s references to the company and its work. I started out in stop motion, but retrained in CG in the mid 1990s, when I saw the way the industry was going.

 

Disruptive Technology and the future of film making

 

FMX 2014 focused heavily on what is called Disruptive Technology, meaning technology so radical that it changes the way we do things in the field it affects. I had my life’s major technical disruption when I switched from analogue animation techniques to computer generated imagery, and I suspect the coming revolution will equal that in its impact on careers in film making. This time, though, I believe it will touch on all forms and stages of production, in animation, VFX and live action, and in the course of these follow-up articles I hope to illustrate that proposition.

 

In terms of rendering, the major leap is towards Realtime, film-quality rendering. Both Vlado and Lon, in their different ways, told me we are still some years away from that, but we agreed the gap is closing, and closing fast. I asked Lon first what he thought about this, and he gave me a comprehensive and enthusiastic answer.

 

“I’ve completely changed my perception of where we are with technology and the boundaries of technology. We are currently working on something we’re calling Massively Parallel Rendering. We’re in this world now where you can tap into several hundred thousand cores, whether it’s on the GPU or whether it’s on the CPU Cloud, or something like that, and get them all working towards the same goal. In some of the tests we’ve done you didn’t see rendering, you didn’t have buckets go across the screen, or see a bunch of pixels converge. All you saw was the image, the final, photo-realistic image.
 

 

“It’s an unforgettable event. Suddenly rendering is no longer rendering,” he said, referring to that long, tedious process anyone working in CGI will have experienced in the past. “It’s something transparent, that happens in the background. And of course, that’s what people have called the Holy Grail of rendering for the longest time. What happens when we’re able to do all this with incredible accuracy and reality? What happens when it becomes instantaneous? What happens next? That’s one area of research we’re really looking forward to.”
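To give a flavour of what “no buckets, just the image” means, here is a toy sketch of my own (in Python, and nothing to do with Chaos Group’s actual code): thousands of workers each produce an independent, noisy estimate of the same picture, and the frame you watch is simply the running average converging.

```python
# A toy sketch of massively parallel, progressive rendering: each "core"
# contributes an independent noisy Monte Carlo estimate of the SAME image,
# and the visible frame is just the running average converging. This is my
# own illustration, not Chaos Group's code.
import numpy as np

def worker_sample(truth, rng):
    """One worker's one-sample estimate; the noise stands in for path tracing."""
    return truth + rng.normal(0.0, 0.5, truth.shape)

truth = np.linspace(0.0, 1.0, 256).reshape(1, -1).repeat(64, axis=0)  # stand-in scene
rng = np.random.default_rng(1)

accum, n = np.zeros_like(truth), 0
for _ in range(10_000):            # imagine each pass running on a separate core
    accum += worker_sample(truth, rng)
    n += 1
estimate = accum / n               # the image everyone is working towards

# Monte Carlo error falls as 1/sqrt(n): 10,000 cores give roughly 100x less noise.
print(f"RMS error after {n:,} samples: {np.sqrt(np.mean((estimate - truth) ** 2)):.4f}")
```

The point of the maths is that the work is embarrassingly parallel: averages of independent estimates can be merged in any order, which is why a few hundred thousand cores can all push the same frame towards the final, photo-realistic image.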

 

The prospect of Realtime Hi Res rendering beckons.

 

Vlado, it should be said, was rather more cautious than Lon. He liked the work being done in R&D in Los Angeles, but he and his team are the ones who have to write the code, line by line (and we’re talking about millions of lines of code in programs at this level of sophistication).

 

“I still think the hardware is not quite powerful enough to give us photo-real games in Realtime. Game engines are getting better and better, that’s true, and they can do more and more things, but if you compare even the best quality game engines today, you still see things like shadow maps; they don’t do proper multibounce reflections and refractions, they don’t have accurate glossy reflections, and so on. So they look a lot better [than they did up until now], but I can still tell a game from a photograph. I cannot tell a high quality CG render from a photograph, [but that] is not Realtime. There is still a gap and, to be honest, I’m not sure it is going to be closed very soon. We’ll get better game engines, that’s for sure, and it will get faster. Photoreal rendering is also getting faster for sure, but I don’t know when the two are going to converge. I still think we are some years away from that.”
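To put a number on the difference Vlado describes, here is a toy “furnace” calculation of my own (again, not V-Ray’s internals): in a closed environment with uniform emission 1 and surface albedo a, the exact multibounce answer is the geometric series 1 + a + a² + … = 1/(1 − a), and capping the bounce count, as Realtime engines typically must, visibly loses light.

```python
# A toy "furnace" test (my own illustration, not V-Ray internals). With uniform
# emission 1 and surface albedo a, full multibounce light transport sums the
# geometric series 1 + a + a^2 + ... = 1 / (1 - a). Truncating the bounce
# count, as realtime engines typically must, systematically darkens the image.

def radiance(albedo: float, max_bounces: int) -> float:
    """Total light gathered when paths are cut off after max_bounces bounces."""
    return sum(albedo ** k for k in range(max_bounces + 1))

albedo = 0.7                      # a fairly bright, diffuse white wall
exact = 1.0 / (1.0 - albedo)      # the converged, offline answer (~3.33)
for bounces in (1, 2, 4, 8, 32):
    approx = radiance(albedo, bounces)
    print(f"{bounces:2d} bounces: {approx:.3f} ({100 * approx / exact:.0f}% of exact)")
```

With one bounce you capture only about half the light; the offline render that “cannot be told from a photograph” is the one that keeps going until the series has converged.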

 

Years, but not Decades

 

Vlado is right, of course; this is something that will take years, but it will not take decades. In 1996 we were animating the skeleton without the geometry, because the PowerAnimator viewport couldn’t refresh the geometry quickly enough for scrub playback; rendering took an hour or more per frame for a single character (we were working on a TV series about dragons); and it took an hour just to close the file! That’s less than twenty years ago. It is not many years since Autodesk Maya and the other film quality packages brought us fully hardware-rendered viewport scrubbing, and now here come high-end, film-resolution rendered frames in Realtime!

 

The Chaos Group have been working towards this goal, and will continue to develop V-Ray RT (Real Time) in conjunction with a number of partners to eventually make the dream a reality.

 

“We’ve been teaming up with a local company that’s just starting up here, called Nurulize. They made some press recently at the Tribeca Film Festival, because they did a demonstration of some technology they’ve been working on with the Oculus Rift [the virtual reality headset display]. They took a scene from a film and brought it to you as an incredibly realistic virtual [360° scene], so that the user could actually walk around and engage with it, with a high degree of realism and accuracy. And that actually is a different application of V-Ray. They used it to pre-render some lighting and things, and then added that to the engine they’re putting it in.

 

“So combine what happens when rendering becomes transparent and then what happens when the experiences that we’re looking forward to having are completely transcendent of just pixels on a screen, and there’s pretty interesting stuff on the horizon, I think.

 

“The one common thread that I see where V-Ray fits into the mix, is, we think that V-Ray can bring rendering to the masses, to where it doesn’t have to be this highly technical and difficult experience. It can actually be something that is quite literally Drag & Drop, which is also exciting.”

 

I asked if this meant an end to pass rendering and compositing, whether we are coming to a time when it all happens in the background, as soon as you point the camera.

 

Lon still thinks offline rendering will be the staple methodology for some time to come, although “That’s essentially where I see this confluence of technology and hardware and software ending up. I think we’re only at the runway leading up to it, we’re not there yet. There still need to be some advances in computing power and what the software can do to take advantage of it, but certainly that’s something that we have our sights set on.”

 

V-Ray RT – Realtime high-res rendering

 

When we were talking at FMX 2014, Peter told me Chaos Group are working towards a situation where the Director can look at his handheld monitor and see the actors in a virtual set, fully rendered and lit in Realtime. I asked Lon if this was what he was talking about. He told me about a film they have been making called “CONSTRUCT”, in which they have V-Ray RT running inside Autodesk’s MotionBuilder.

 

 

“We’ve been collaborating in our LA office with some local artists, led by director Kevin Margo, who is an absolutely amazing filmmaker and artist. In his day job Kevin works as a VFX/CG Supervisor at Blur Studio. He and a small group of talented artists, with our help and the help of NVIDIA and a few other folks, have embarked on an experiment [to find an answer to the question]: how can we render final-frame images on the GPU? We’ve heard about GPU rendering for a long time, but we’re now actually at the point where the feature set and the memory efficiency of the GPU converge.

 

“What is interesting is that that was where the experiment started, and it was actually the easiest part.”

 

I should mention here that Lon was quick to say that he didn’t mean that it was easy to do all the background artwork, and develop and design the characters, “but when it came to rendering it, it was literally like, Hit the button and wait five to ten minutes for an HD frame. And that was just on a few GPUs.”

 

“As the work progressed, Kevin wanted to expand on this idea of taking Raytracing and Path tracing [see Side Box] into the Motion Capture studio. His plan, his hope, was to take a virtual camera [in this case, Lon told me, an OptiTrack Insight VCS], hook it up to Autodesk MotionBuilder, and plug in V-Ray RT, so that what he saw would be what he would see in the final rendering, but in Realtime.

 

“We put together a prototype. A couple of our developers took the API, which had just recently been opened up for MotionBuilder, and we used something we call the V-Ray Application SDK, a sort of wrapper around V-Ray that allows it to plug into other applications, and within a few weeks’ time we actually had this thing up and running and working inside the Mocap volume. Instead of characters covered in ping-pong balls [mocap markers], what you actually saw was the robots those actors would eventually be in the film, in the Virtual Reality environment, lit and raytraced in Realtime. It’s a little fuzzy, a little grainy right now, which is partially a by-product of it being a prototype and of the hardware we’re running it on, but even now, if you pause for a split second, it converges incredibly fast to a realistic view.”
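I can only guess at what the V-Ray Application SDK actually looks like, but the general shape of such a wrapper is a well-known pattern, sketched below with entirely hypothetical names: the host application streams camera and mocap updates into an embedded renderer, which restarts its progressive refinement on every change.

```python
# Hypothetical sketch of the wrapper pattern Lon describes. NONE of these names
# come from the real V-Ray Application SDK; they only illustrate how a host app
# (a stand-in for MotionBuilder) might drive an embedded progressive renderer.

class EmbeddedRenderer:
    def __init__(self):
        self.camera, self.transforms, self.samples = {}, {}, 0

    def update_camera(self, **params):
        """Host pushes virtual-camera moves; refinement restarts from scratch."""
        self.camera.update(params)
        self.samples = 0

    def update_transform(self, node, matrix):
        """Host pushes live mocap skeleton data, frame by frame."""
        self.transforms[node] = matrix
        self.samples = 0

    def refine(self):
        """One progressive pass per host tick; pausing lets samples pile up."""
        self.samples += 1
        return f"image at {self.samples} samples per pixel"  # stand-in for pixels

renderer = EmbeddedRenderer()
renderer.update_camera(position=(0.0, 1.7, 5.0), fov=40.0)
for tick in range(3):                                     # the host's update loop
    renderer.update_transform("robot_hips", matrix=tick)  # live mocap stream
    print(renderer.refine())
```

Because every scene change resets the sample count, the image is grainy while the camera moves and “converges incredibly fast to a realistic view” the moment you pause, which is exactly the behaviour Lon describes.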
 

This is a short Making Of movie that Kevin and the Chaos Group put together to demonstrate what they achieved. I think you will agree it is pretty amazing, and that it points towards a whole new era of film making.

 

 

Lon went on, “We think with some of the new things that are out, like the virtual computing cluster that NVIDIA has recently launched, that in conjunction with this, [we’re] approaching raytraced Realtime scenes in a live action [studio].”

 

 

This is something that FMX 2014 discussed over several panels and demonstrations, under the heading Virtual Production. I will write on this in more detail in another article, but in essence, film makers are looking at being able to pre-visualise the whole film at the earliest stages of production. In fact, between the start and end of FMX, the talk had moved on. By the final day, when the 5D Institute presented a whole day of discussions on the future of production methodology, it had shifted from describing a linear virtual production process to envisaging what Alex McDowell of the 5D Institute called non-linear virtual production. In this model, all departments could gather around the virtual camera, each on their own iPad (or equivalent), and look at the scene, fully lit, with the actors in position: the designer able to make instant corrections to the placing of the set, the director blocking the actors, and the DoP able to draw up a lighting list based on something he can see, as opposed to trying to interpret the verbal descriptions the director and designer give him, as has been the case up until now.

 

“The prototype just allows you to move around the volume and film it,” Lon told me. “The next stage will be to hook up that camera to the camera controls inside V-Ray, and then you’ll be able to control your shutter angle, your F-stop, and all of that. You will be able to control a virtual camera in exactly the same way you can control a regular one.

 

“Beyond that, you will be able to virtually light this thing: get a grip to do motion-tracked cards that could essentially be whatever you want, a spotlight, a Kino Flo, any kind of live action lighting. That’s Phase 2 or 3, we’ll say, and something we will continue to experiment with.”
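Those controls are not hand-waving: shutter angle and F-stop map onto standard photographic formulas that a virtual camera has to honour. A small sketch (the maths is textbook; the function names are mine):

```python
# The physical camera controls Lon mentions, expressed as the standard
# photographic formulas a virtual camera must honour. The maths is textbook;
# the function names are my own.
import math

def shutter_time(shutter_angle_deg: float, fps: float) -> float:
    """Exposure time in seconds: a 180-degree shutter at 24 fps gives 1/48 s."""
    return (shutter_angle_deg / 360.0) / fps

def exposure_value(f_stop: float, t: float, iso: float = 100.0) -> float:
    """EV = log2(N^2 / t) - log2(ISO / 100), the brightness the render must match."""
    return math.log2(f_stop ** 2 / t) - math.log2(iso / 100.0)

t = shutter_time(180.0, 24.0)                       # the classic cinema shutter
print(f"1/{1 / t:.0f} s at f/2.8 -> EV {exposure_value(2.8, t):.1f}")
```

Wire those numbers into the renderer’s camera and the virtual viewfinder exposes and motion-blurs like the real camera a DoP is used to.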

 

I was reminded of that time in the mid-1990s, when the world seemed to be changing out of all comprehension. I was 50 when I learnt computer animation and launched myself into the world of CGI, and I am reminded now of something Douglas Adams, author of The Hitchhiker’s Guide To The Galaxy, once wrote:

 

“Everything that existed when you were born is Normal. Everything that comes into existence in the first forty years of your life is fabulous and exciting. Everything that comes along after you’re forty is a crime against nature.”

 

Well, if you’re over forty, get ready for some serious crimes against nature in the coming decades!!

 

With the kind of industry transformation that Lon is talking about, careers are going to end, and new careers are going to come into being.

 

Lon agreed. “Yeah, it’s interesting that you say that, because one of the things that came up in a recent round table discussion we had with Kevin Margo and Mike Romey from Zoic Studios was that there are probably Directors of Photography who see some of the changes in CG as taking away aspects of what they used to control. The reality is that they are now put into a whole new paradigm, where they can take their expertise and their experience and apply it in a totally different and unique way. So we see that there may be some shifts, but we still think there is every chance for those jobs to turn into opportunities.”

 

(Zoic Studios have done the virtual production on many shows, such as “Once Upon A Time” and “Falling Skies”.)

 

The point being, if a DoP or Director is prepared to embrace the new technology, he or she is going to be working with the same visual experience, just delivered in a new way. They just have to get used to the idea that they can only see it in the monitor; they can’t see it in front of their eyes, through the eyepiece. It’s life, but not as we know it, Spock...

 

Phoenix FD – Chaos Group’s VFX Simulation software

 

I asked Vlado about Chaos Group’s other main product, Phoenix FD [Fluid Dynamics]. I asked, rather bluntly, if he thought it had a future, as in my experience not many people use it.

 

“Right now it is actually starting to pick up a little bit. It has a long history. We did a plugin called Phoenix way back in 1998. It was not really based on fluid dynamics; we tried to cheat fire and smoke effects. It was our first project, actually; V-Ray came after that. We decided that if Phoenix was going to be successful it needed to be an actual fluid dynamics simulator, so we did something called “Aura”. It was supposed to be the successor to Phoenix, but unfortunately we didn’t finish that project. We already had a large amount of code, and it was more or less working, but it didn’t go anywhere because we were focused on V-Ray at the time. So it stayed like that for a few years.

 

“And then we decided to reboot the whole thing. We got some very clever guys to work on it again. Basically we rewrote it from scratch, and as it turned out we decided to add not only fire and smoke simulation, but also water and splashes and foam, and all kinds of these interesting effects. It took a while for the whole thing to come together but it’s really starting to pick up right now, especially the parts about water and foam, with splashes and everything. We have very good ocean tools in Phoenix now. So, it’s moving along, it’s getting a little bit more popular these days.”
 

 

I asked what if any part Phoenix FD might play in Chaos Group’s pursuit of Realtime rendering.

 

“We’re not for the moment pushing Phoenix into the Realtime field. There is some research in that area but you have to cut a lot of corners to get to Realtime simulation of fluids, and water, and these kinds of effects. It’s still computationally expensive. The new hardware definitely helps to make the process faster, but for the Realtime thing you still have to simplify a lot of things.

 

“The calculation algorithms are not quite as precise and the interaction with other objects in the scenes is not quite as accurate. You take some shortcuts with the shading so that it sort of looks real but it’s not exactly the pure lighting simulation. It does the job, it looks fine but you wouldn’t really want to use this for a movie or anything like that. It works for games and for many other relatively simple applications, but if you want really high end, really realistic results we’re still not Realtime.”
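A classic example of the kind of shortcut Vlado means (a textbook technique, not anything from Phoenix FD’s code) is semi-Lagrangian advection from Jos Stam’s “Stable Fluids”: it stays stable however large the time step, which is why Realtime engines love it, but its interpolation smears fine smoke detail away.

```python
# Textbook semi-Lagrangian advection (Stam's "Stable Fluids"), the classic
# realtime corner-cut: unconditionally stable at any time step, but the
# bilinear interpolation numerically diffuses fine detail. This is a generic
# illustration, not Phoenix FD's code.
import numpy as np

def advect(field, vel_x, vel_y, dt):
    """Advect a scalar field (e.g. smoke density) by tracing each cell
    backwards along the velocity field and sampling where it came from."""
    h, w = field.shape
    grid = np.mgrid[0:h, 0:w].astype(float)
    ys, xs = grid[0], grid[1]
    src_x = np.clip(xs - dt * vel_x, 0, w - 1)     # backtrace departure points
    src_y = np.clip(ys - dt * vel_y, 0, h - 1)
    x0, y0 = src_x.astype(int), src_y.astype(int)
    x1, y1 = np.minimum(x0 + 1, w - 1), np.minimum(y0 + 1, h - 1)
    fx, fy = src_x - x0, src_y - y0
    top = (1 - fx) * field[y0, x0] + fx * field[y0, x1]   # bilinear sample:
    bot = (1 - fx) * field[y1, x0] + fx * field[y1, x1]   # stable, but smears
    return (1 - fy) * top + fy * bot

density = np.zeros((64, 64)); density[30:34, 30:34] = 1.0   # a square puff of smoke
vx, vy = np.full((64, 64), 0.8), np.full((64, 64), -0.3)    # a uniform wind
for _ in range(20):
    density = advect(density, vx, vy, dt=1.0)
print(f"peak density after 20 steps: {density.max():.2f} (started at 1.00)")
```

After twenty steps the puff has drifted downwind but its peak has visibly melted away: fast and stable, and “it does the job, it looks fine”, but it is not the pure simulation an offline shot demands.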

 

Making Chaos Group products available to amateurs, professionals, and anyone interested

 

I asked Lon about pricing structures, thinking of students and aspiring professionals: people who want to try the software but cannot afford professional licenses. Did he see these products becoming more accessible, as in cheaper? I was thinking of that young couple, Miguel Ortega and Tran Ma, whom I also met at FMX 2014 in Stuttgart. They worked with Chaos Group in making their films.

 

Lon said, “I think the lines between professional and hobbyist software have continued to change and adapt. We’re introducing some new licensing policies that help bring the costs down and give access to it. At the same time, it has also brought opportunities to scale up to some of the larger institutions that have huge render farm demands and so on. But what’s interesting, and what’s intriguing to me, is that I see the notion of rendering really just becoming rendering as a service, so that you’re tapping into someone else’s cloud hardware, or whatever resources you need, whenever you need on-demand rendering. The big studios and so on will continue to need a base line so that they can get through whatever projects they have, and then maybe they’ll go to the cloud whenever it’s time to spike up or down.

 

“For folks like Miguel and Tran, as an example, they don’t have the need or the infrastructure to keep that around but they could just as easily send their files to the Amazon Cloud or use any one of those services, and basically be able to work on a Pay As You Go basis, and treat it a bit more like it’s a utility bill. This is going to democratise it in a huge way.

 

“I’ll give you an example. On a trip to New York, I met an artist, Alex Scollay, who was working, similar to Miguel and Tran, with his wife on an interactive children’s book. They worked together to do all the illustrations and the renderings, and he had one machine in his apartment and rendered to the Amazon Cloud. I think he told me his Amazon rental costs came to a total of around 600 bucks [US dollars]. And that gave him everything he needed to render out the fully rendered iPad children’s book, which is phenomenal.”
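Lon’s anecdote is easy to sanity-check with back-of-the-envelope arithmetic; the render times and hourly rate below are my own assumptions, chosen purely so the total lands near the figure he quotes, not Amazon’s actual prices.

```python
# Back-of-the-envelope pay-as-you-go rendering costs. The frame count, render
# time and hourly rate are illustrative assumptions of mine, picked so the
# total lands near the ~600 dollars mentioned above; they are NOT real prices.
frames            = 1800    # a fully rendered interactive children's book
minutes_per_frame = 20      # assumed render time on one cloud instance
rate_per_hour     = 1.00    # assumed dollars per instance-hour

instance_hours = frames * minutes_per_frame / 60
print(f"{instance_hours:.0f} instance-hours -> ${instance_hours * rate_per_hour:.0f}")
# 600 instance-hours cost the same whether one machine grinds for 600 hours or
# 100 machines finish in six: the elasticity that lets a cloud farm "spike up
# or down" on demand.
```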

 

The Amazon Cloud was news to me. I know there are render farms that operate online, but a publicly available Cloud service would be a fantastic asset.

 

A cloud service such as Amazon’s allows you to buy time on the cloud. You can mirror your machine, or, if you are a company, the topology of your network, and render directly to it. Lon described that as “a bare bones solution”. But he added, “There are other answers out there, cloud rendering providers that essentially take care of the connection for you and simplify it, so that it’s as easy as hitting a button and going to the cloud. Those solutions have an added cost for what they’re providing, but it’s still pretty effective.”

 

He cited the example of Robert Zemeckis’s film “Flight”, whose VFX were done by a single studio in San Francisco called Atomic Fiction. Lon told me they had only a small local render farm at their facility, and otherwise relied almost entirely on cloud rendering.

 

“Kevin Baillie, the VFX Supervisor there, walked me into the kitchen and he pointed to his render farm, which was literally a fibre cable that went out of the building. They rendered the entire film of “Flight” on the Amazon Cloud using a provider service.

 

“And what was interesting about that is that Kevin and Atomic Fiction built their company essentially to rely on the cloud. So rather than trying to buy, lease or rent a bunch of machines, and find space to create their own data centre, they said, let’s look at this [idea of using the cloud].

 

“One very interesting thing that came out of that is that “Flight” was going to be 120 [VFX] shots when it started, and it came out at about 400 shots, a little over three times what they were expecting. Under normal circumstances that just wouldn’t work, right? But because they had already built their infrastructure to rely on the cloud, they could scale their render farm up or down as they needed. Instantaneously. So then the challenge for them was [the] need to find a group of talented artists to add to the mix, [not] the problem of suddenly having to figure out how to render 300 more shots in the same amount of time.”

 

And so we go on into the future, and FMX 2015

 

I will be at FMX 2015, insha’Allah, and I am looking forward to seeing what else this creative company will bring there. As the Chinese say, we are living in interesting times, though they mean that in a somewhat cynical way, as in dangerous times of change. I see no danger, only ever more exciting developments as we make our way into the Computer Age of Entertainment Media.

 

As a final word, let’s go back to Peter and our conversation at FMX 2014. As we were finishing up our chat, he looked at me with an almost wistful expression on his face and smiled. “I love light,” he said.

 

 

 

Related Links

 

The Chaos Group

V-Ray Showreels

Chaos Phoenix

Zync Render

Blur Studio

Zoic Studios

Kevin Margo

Constructfilm.com

Construct on Facebook

Nurulize

NVIDIA

2014 FMX Report


 

