Developers, many of whom have squandered their youth cloistered away playing computer games, don't have enough life experience to figure out an application that would be meaningful to people who aren't enamored with technology for technology's sake. OK, so maybe we've all spent a bit too much time with a game controller in our hands - but the point still stands: much like riding a roller-coaster, the thrill of VR is ephemeral.
The Oculus Rift head-mounted display isn't really about Virtual Reality - that's been around since Jaron Lanier coined the phrase back in the early '80s. Oculus simply figured out how to clock the IMU faster, and then Facebook stepped in and acquired them for $2 billion because they understand where this is all going. Motion-to-photon latency is the delay between moving your head and seeing the view update on the display. For Oculus, that magic number is somewhere under 20 milliseconds, and even though you can't consciously perceive the lag itself, for the very first time VR didn't make you puke.
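To get a feel for how tight that budget is, here's a back-of-the-envelope latency tally in Python. The individual stage timings are illustrative assumptions on my part, not Oculus's published numbers:

```python
# Back-of-the-envelope motion-to-photon latency budget.
# All stage timings below are illustrative assumptions,
# not Oculus specifications.

BUDGET_MS = 20.0  # the commonly cited comfort threshold

stages_ms = {
    "IMU sampling (1000 Hz -> ~1 ms worst case)": 1.0,
    "sensor fusion / pose prediction": 1.0,
    "game logic + render (90 fps frame)": 11.1,
    "display scan-out": 5.0,
}

total = sum(stages_ms.values())
for stage, ms in stages_ms.items():
    print(f"{stage}: {ms:.1f} ms")
print(f"total: {total:.1f} ms "
      f"({'within' if total <= BUDGET_MS else 'over'} the {BUDGET_MS:.0f} ms budget)")
```

The point of the exercise: a single dropped frame at 90 fps adds another 11 ms and blows the whole budget, which is why the IMU and the renderer both have to be fast.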
In an attempt to create a virtual environment for pre-visualizing and developing content, we invited the CTO of UNITY (Joachim Ante), the head of the Autodesk MAYA dev team (Bruno Sargent) and the smartest AI guy in the gaming industry (Dr. Bill Klein) to hang out at the Culver City offices for a few days and help us work on a wild idea. Amazingly, they all showed up, and the result was a quasi-real-time VR content generator.
A few more months of coding and tweaking and we finally had an easy-to-use system that combined the power of MAYA (the motion picture industry's most powerful 3D application) with the real-time performance of UNITY (the game industry's most powerful game engine), populated with the smartest CG characters in the world. Cool! The first thing we did (after getting it patented) was use it on a couple of movie projects around town as a prototyping tool. It was a big hit, and everyone could see that the potential really was unlimited. (see video below)

The basic concept is that you pick some generic characters, dress them, select their AI personas and drop them into a set. You can build a set if you like, but since the system is game engine-based, just about any environment you can think of is already built. Then all you do is drag and drop the dialogue (either slugs from the script or pre-recorded audio clips), give your characters some rudimentary blocking instructions and hit play.
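To give a flavor of that workflow, here's a minimal sketch in Python. Every name here (Scene, Character, add_dialogue and so on) is invented for illustration - the actual VCG API isn't public - but the steps mirror the paragraph above: pick characters, dress them, assign a persona, drop in dialogue and blocking, hit play:

```python
# Hypothetical sketch of the VCG scene-setup workflow described above.
# All names (Scene, Character, add_dialogue, ...) are invented for
# illustration; they are not the real VCG or UNITY/MAYA APIs.
from dataclasses import dataclass, field

@dataclass
class Character:
    name: str
    wardrobe: str
    ai_persona: str                          # e.g. "gruff detective"
    blocking: list = field(default_factory=list)

@dataclass
class Scene:
    environment: str                         # any pre-built game-engine set
    characters: list = field(default_factory=list)
    dialogue: list = field(default_factory=list)

    def add_dialogue(self, character: str, line: str):
        # "line" could be a slug from the script or a pre-recorded audio clip
        self.dialogue.append((character, line))

    def play(self):
        print(f"Playing scene in '{self.environment}' with "
              f"{len(self.characters)} characters and "
              f"{len(self.dialogue)} dialogue cues...")

# Build the scene exactly as the paragraph describes:
scene = Scene(environment="diner_interior_night")
scene.characters.append(Character("Ray", "trench coat", "gruff detective",
                                  blocking=["enter left", "sit at counter"]))
scene.characters.append(Character("Lou", "apron", "chatty line cook"))
scene.add_dialogue("Ray", "Coffee. Black.")
scene.add_dialogue("Lou", "Rough night?")
scene.play()
```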
And here's the really cool bit: the cameras all have AI scripts too, so they know when and where to look. The system will automatically set up master shot sequences according to basic movie industry conventions (Wide Establishing > Master > Tight Reaction > Two Shot > Reverse, etc.). You can watch ("Lurk") conventionally on any 2D monitor, watch in VR mode on a headset, or take the POV of any of the characters in the scene.
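A camera AI like that can be thought of as a simple rule-driven sequencer: specific scene events trigger specific shots, and when nothing special is happening it falls back to the conventional coverage order. The event-to-shot rules in this sketch are invented for illustration; the real system's camera scripts are presumably more sophisticated:

```python
# A minimal sketch of an AI shot sequencer in the spirit described above.
# The event-to-shot rules are invented for illustration only.

SHOT_ORDER = ["wide_establishing", "master", "tight_reaction",
              "two_shot", "reverse"]

def pick_shot(event: str, last_shot: str) -> str:
    """Map a scene event to the next shot, falling back to the
    conventional coverage order when nothing specific applies."""
    rules = {
        "scene_start":  "wide_establishing",
        "new_speaker":  "reverse",
        "big_reaction": "tight_reaction",
        "two_enter":    "two_shot",
    }
    if event in rules:
        return rules[event]
    # otherwise advance through standard coverage
    i = SHOT_ORDER.index(last_shot)
    return SHOT_ORDER[(i + 1) % len(SHOT_ORDER)]

shot = "wide_establishing"
for event in ["scene_start", "dialogue", "new_speaker", "big_reaction"]:
    shot = pick_shot(event, shot)
    print(f"{event:>12} -> {shot}")
```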
It's plug and play with both Oculus and UNITY, so you can have numerous people, all with different vantage points, viewing and even participating in the same scene. You can "possess" a character (we all agree that we need a better term for this, but it was the first thing that came to mind and it seems to have stuck) that is normally driven by its AI protocol and actually become that character. If the character has dialogue or action in a scene, you can even override the script and "improvise". Since the major characters are AI-driven, they can adapt and gently guide you back to the central theme of the scene.
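Mechanically, possession is a control handoff: the AI drives the character until a player takes over, and when the player improvises, the AI tries to bridge back to the scripted beat. Here's a hedged sketch of that handoff; all class and method names are hypothetical, not the actual system's API:

```python
# A hedged sketch of the "possession" handoff: an AI-driven character
# yields control to a human player and steers the scene back on theme
# when the player improvises. All names here are hypothetical.

class ScriptedCharacter:
    def __init__(self, name, script_lines):
        self.name = name
        self.script = iter(script_lines)
        self.possessed_by = None          # None -> AI drives the character

    def possess(self, player):
        self.possessed_by = player

    def release(self):
        self.possessed_by = None

    def next_line(self, improvised=None):
        scripted = next(self.script, "(scene over)")
        if self.possessed_by and improvised:
            # Player overrides the script; the AI bridges the
            # improvised line back toward the scripted beat.
            return f"{improvised}  [AI nudges back toward: '{scripted}']"
        return scripted

ray = ScriptedCharacter("Ray", ["Coffee. Black.", "Long story."])
print(ray.next_line())                     # AI delivers the scripted line
ray.possess("player_1")
print(ray.next_line("Make it a double."))  # improvised override
```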
Directors, DPs, Producers, Actors, Artists - everyone who has used it has fallen in love with it.
So here, then, is a small taste of our VCG (Virtual Content Generator) or, as our friends in production like to call it, "The Prototyper".