Fabric Engine 2 for MoCap/set dressing/interactive VR
Fabric Software has revealed details of tools it is building that allow VR producers to storyboard and edit content directly within the virtual environment, and to capture that data and bring it back into an authoring application like Autodesk Maya.
Fabric Software CEO Paul Doyle said: “Virtual and Augmented Reality – and Computer Vision in general – teases us with an exciting future but as we speak to people that are trying to build high-end content for these platforms, we keep hearing the same problems – game engines are fantastic when it comes to playing back rich content, but they struggle when it comes to highly iterative authoring.”
A game engine is highly optimized to play back content at the best fidelity possible, striking a balance between richness of content and frame rate. This means the data is crunched and massaged to perform at the best possible level, but as a result it is no longer in an editable state. If playback and experiencing the content is all that is required, this is not a problem, but it is limiting when you want to edit that content and make changes to it. This is where Fabric Engine can help.
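To illustrate the trade-off with a generic sketch (not Fabric's or any particular engine's pipeline): baking a transform hierarchy into flat world-space geometry is ideal for playback, but it discards the structure an artist would need in order to edit the scene afterwards.

```python
# Illustrative only: why "crunched" runtime data loses editability.
# The source scene keeps objects as (local offset, parent) -- editable.
# The baked form flattens everything to world-space positions -- fast to
# draw, but the hierarchy and local offsets can no longer be recovered.

def bake(scene):
    """Flatten a parented 2D scene into world-space positions for playback."""
    def resolve(name):
        node = scene[name]
        x, y = node["local"]
        if node["parent"] is not None:
            px, py = resolve(node["parent"])
            x, y = px + x, py + y
        return (x, y)

    return {name: resolve(name) for name in scene}

scene = {
    "root":  {"local": (10, 0), "parent": None},
    "child": {"local": (5, 5),  "parent": "root"},
}

baked = bake(scene)
# baked["child"] is (15, 5): correct for display, but moving "root"
# afterwards no longer moves "child" -- the relationship was baked away.
```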
Fabric loads the data natively, meaning content creators can make changes and save those results back out. Fabric can also be extended to support pretty much any file type – it ships with support for Alembic and FBX, it works with USD and it includes the tools necessary for supporting proprietary data types, or data types from other industries.
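One common way to make format support pluggable along these lines is a reader registry keyed by file extension. This is a hypothetical sketch of the pattern only, not Fabric's actual extension API; the function and format names are invented for illustration.

```python
# Hypothetical sketch of extensible format support via a reader registry.
# Built-in and proprietary formats register the same way; nothing here is
# real Fabric Engine API.

readers = {}

def register_reader(extension):
    """Decorator: associate a loader function with a file extension."""
    def wrap(fn):
        readers[extension] = fn
        return fn
    return wrap

@register_reader(".abc")
def load_alembic(path):
    return {"format": "Alembic", "path": path}

@register_reader(".fbx")
def load_fbx(path):
    return {"format": "FBX", "path": path}

def load(path):
    """Dispatch to the registered reader; unknown types fail loudly."""
    for ext, fn in readers.items():
        if path.endswith(ext):
            return fn(path)
    raise ValueError(f"no reader registered for {path}")

# A studio could add a proprietary type the same way:
@register_reader(".myscene")
def load_proprietary(path):
    return {"format": "ProprietaryScene", "path": path}
```

The point of the pattern is that the loader, not the core, owns the format knowledge, so adding a data type from another industry never requires touching the dispatch code.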
Fabric can likewise be extended to support different libraries, allowing content creators to work with almost any hardware device. In the prototype demos, Fabric is shown working with Oculus, Xbox controllers, Xsens motion capture and Hydra controllers – all accessible through the Canvas visual programming system that is part of Fabric Engine 2.
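Device support tends to follow the same shape: each vendor library is wrapped behind a small common interface, so whatever consumes the input downstream does not care which hardware produced it. A generic sketch of that adapter pattern (hypothetical class names, stubbed SDK calls; not the Canvas API):

```python
# Hypothetical adapter sketch: normalize different input devices to one
# event shape so downstream code can consume any of them uniformly.

from dataclasses import dataclass

@dataclass
class PoseEvent:
    device: str
    position: tuple   # (x, y, z) in some shared world space
    buttons: dict     # button name -> pressed?

class HydraAdapter:
    def poll(self):
        # A real adapter would call the vendor SDK here; stubbed values.
        return PoseEvent("hydra", (0.0, 1.2, 0.3), {"trigger": False})

class XboxAdapter:
    def poll(self):
        return PoseEvent("xbox", (0.0, 0.0, 0.0), {"a": True})

def gather_input(adapters):
    """One polling pass over every attached device."""
    return [adapter.poll() for adapter in adapters]

events = gather_input([HydraAdapter(), XboxAdapter()])
```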
Doyle said: “The first use case we tried was storyboarding in VR. We wanted to make it possible for a content creator to load in a scene and then mark up that scene with a ‘3D stroke’ system. And most importantly, we wanted to capture that data and bring it back into an authoring application like Autodesk Maya. Then we started playing around with different ideas. Could we hook up a motion capture suit and use it with the strokes system? Could we build a set dressing system so you could walk a virtual set and make live edits to it? How about a complete VCS system? The answer was ‘yes’ to all of these questions and we’re pretty excited by the results!”