Mixed reality capture is one of the best ways to show how people interact with VR. It showcases the interactivity and emotional impact that make VR so compelling. Because it merges VR and real-world footage in real time, it’s also a great way to broadcast live.
The tools are improving quickly, and efforts across the industry to explore the technology suggest mixed reality is gaining momentum. Facebook, for example, used mixed reality to broadcast The Unspoken, a magic-dueling game, live from the E3 show floor.
The demonstration was an intriguing test of the social media giant’s streaming technology. Meanwhile, HTC’s presence at E3 included shareable mixed reality: after HTC staff captured my time inside Space Pirate Trainer, a link to the footage popped up on a nearby tablet. I entered my email address and got a link to the video a few minutes later.
The developers of the creative VR game Fantastic Contraption are keeping their software at the forefront, popularizing mixed reality capture in the process. The game can now connect directly to a camera and composite footage within the app, making the setup process considerably easier. Plus, the game can now produce footage that shows virtual objects floating in your living room without a green screen.
Valve added mixed reality support to the SteamVR Unity plug-in a couple of months ago, so most developers who use the popular game creation toolset can add the feature. When a third controller is present in a Unity game running through SteamVR, the game window switches to a four-quadrant view used for compositing mixed reality. We recently used the approach to capture this Cyberpong gameplay:
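The compositing step works from that quadrant layout: each quarter of the game window carries a separate layer that the capture software pulls apart before mixing in the camera feed. As a rough illustration, splitting a captured frame into its quadrants looks like this (the exact assignment of layers to quadrants varies by tool, so the labels below are assumptions):

```python
import numpy as np

def split_quadrants(frame):
    """Split a four-quadrant mixed reality capture frame into its
    layers. `frame` is an (H, W, 3) image array; which layer lives in
    which quadrant is an assumption here, labeled for illustration."""
    h, w = frame.shape[0] // 2, frame.shape[1] // 2
    return {
        "top_left": frame[:h, :w],       # e.g. foreground color
        "top_right": frame[:h, w:],      # e.g. foreground alpha matte
        "bottom_left": frame[h:, :w],    # e.g. background view
        "bottom_right": frame[h:, w:],   # e.g. unused / game view
    }
```

Each extracted quadrant is then scaled back to full output resolution before compositing with the live camera footage.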
While Unity developers have been able to include mixed reality in their games, developers using the other leading VR creation toolset, Unreal Engine 4, haven’t been so lucky. That is expected to change in the next update to UE4 from Epic Games. I contacted Nick Whiting, technical director of VR and AR at Epic, and asked him how the capture system will work.
Whiting wrote in an email that it’ll be similar to current approaches and work the same way on Oculus, Vive, and OSVR. Developers using UE4 will add a VR capture camera to their software that can be either stationary or “adjusted based on the location of the real world camera relative to the origin, or it can be attached to a MotionController component, if you’re using an extra Vive controller, or something else to track the camera’s movements. You’ll also be able to set the FOV, and resolution that you’ll want to render at.”
“Then, in game, all you do is switch into capture mode, either through the console, or through Blueprints, and the mirror window will output a foreground view, and a background view,” Whiting wrote. “You’ll composite that with the real-world video feed, and voila, you’ve got mixed reality!”
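The workflow Whiting describes is standard layered alpha compositing: the virtual background sits behind the keyed camera footage of the player, and the virtual foreground is drawn on top. A minimal sketch with NumPy (the array shapes and the idea of a pre-computed green-screen mask are assumptions on my part, not part of Epic's system):

```python
import numpy as np

def composite_mixed_reality(foreground, fg_alpha, background, camera, key_mask):
    """Layer order, back to front: virtual background, keyed camera
    footage of the player, then the virtual foreground (objects that
    sit between the physical camera and the player).

    Color images are float (H, W, 3) arrays; `fg_alpha` and `key_mask`
    are (H, W) mattes with values in [0, 1]."""
    # Place the green-screened player over the virtual background.
    mid = key_mask[..., None] * camera + (1 - key_mask[..., None]) * background
    # Draw the virtual foreground layer on top.
    return fg_alpha[..., None] * foreground + (1 - fg_alpha[..., None]) * mid
```

In the current UE4 plan this blend happens in external video software; the live-video feature Epic describes below would move it inside the engine.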
Eventually, Epic plans for UE4 to capture live video. When that happens, “we’ll do all that compositing for you, so the user will just toggle it on or off, and see the video feed in the game.” The UE4 feature may change in the coming weeks as Epic begins testing. One thing Whiting notes about the Unreal software is that it will efficiently draw the foreground and background used for creating mixed reality, meaning people will “get much higher resolution captures on lower end equipment.”
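Attaching the capture camera to a tracked controller, as Whiting describes, amounts to composing the controller's tracked pose with a fixed offset from the mount point to the camera lens. A minimal sketch using homogeneous 4×4 transforms (the function names and the translation-only offset are illustrative assumptions, not Epic's API):

```python
import numpy as np

def translation(x, y, z):
    """4x4 homogeneous translation matrix."""
    m = np.eye(4)
    m[:3, 3] = [x, y, z]
    return m

def capture_camera_pose(controller_pose, lens_offset):
    """In-game capture camera pose: the tracked controller's pose
    composed with the fixed controller-to-lens offset."""
    return controller_pose @ lens_offset

# A controller held 1.5 m up, with the lens 10 cm in front of the mount.
pose = capture_camera_pose(translation(0.0, 1.5, 0.0),
                           translation(0.0, 0.0, 0.1))
```

Because the offset is rigid, calibrating it once lets the virtual camera follow every movement of the physical one.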
The direction UE4 is headed sounds similar to the latest features in Fantastic Contraption, as well as the approach being pursued by Roadhouse Interactive, a Vancouver-based developer working on a Unity plug-in that should offer similar features. Roadhouse is working with a few Vive developers to finalize the plug-in’s features and functionality. The company expects to release it in a matter of weeks, and we’ll follow up when it does.
“Valve did some great work with their built-in solution in SteamVR, but it isn’t as seamless as people need and these barriers are preventing developers from getting Mixed Reality solutions in their games,” wrote Kayla Kinnunen, Director of VR at Roadhouse Interactive. “We are focused on building a plug-in whose purpose is to create the easiest path to user-friendly mixed reality for developers. To us this means optimizing the pipeline for fixed position webcams with built-in compositing. We are also looking to expand the feature set once we have the initial plug-in streamlined for developers to integrate and end-users to use.”