Last week I had the chance to interview OTOY’s CEO Jules Urbach about their recent announcements and future roadmap. We talked Unity, open standards, the metaverse, and more, but the general thread pointed to OTOY as one of the pivotal companies in the VR space. Ignoring OTOY would be like ignoring YouTube in the early days of the Internet.
What is OTOY?
At its core, OTOY is essentially a 3D format company. Their mission is to make realistic and distributable 3D graphics. You might have heard of the .OBJ and .FBX file formats, currently the most popular for 3D content. If not, you’ve definitely heard of .JPEG, one of the internet’s most widely used 2D image formats. In March of 2014, OTOY released a brand new 3D format called .ORBX, and it’s been growing ever since.
ORBX is a fundamental building block of the metaverse. Unlike OBJ and FBX, which are limited to geometry, it contains all the details of a high-quality 3D scene: lighting, physics, textures, and more. Whether for streaming video or VR, it promises to be a one-stop, open-source solution for distributing anything the human mind can imagine over the Internet.
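To make the distinction concrete, here is a minimal sketch of the difference between a geometry-only format and a full scene container. The field names are purely illustrative (this is not the actual ORBX schema, which OTOY documents separately):

```python
# Hypothetical sketch: geometry-only format vs. full scene container.
# Field names are illustrative only -- NOT the real ORBX schema.
from dataclasses import dataclass, field

@dataclass
class GeometryOnly:
    """Roughly what an OBJ/FBX-style format carries: shape, nothing else."""
    vertices: list
    faces: list

@dataclass
class SceneContainer:
    """A full scene description, in the spirit of what ORBX promises."""
    geometry: GeometryOnly
    materials: dict = field(default_factory=dict)   # textures, shaders
    lights: list = field(default_factory=list)      # emitters, environment
    physics: dict = field(default_factory=dict)     # masses, constraints
    animation: list = field(default_factory=list)   # keyframed motion

# A one-triangle "scene" with a single light source.
scene = SceneContainer(
    geometry=GeometryOnly(
        vertices=[(0, 0, 0), (1, 0, 0), (0, 1, 0)],
        faces=[(0, 1, 2)],
    ),
    lights=["sun"],
)
print(len(scene.geometry.faces))  # 1
```

The point of the sketch is simply that a renderer receiving only `GeometryOnly` must guess at everything else, while a `SceneContainer` carries enough to reproduce the scene exactly as its author lit and textured it.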
“ORBX supports detail down to a quarter the size of a Hydrogen atom,” according to OTOY.
The Big Unity Announcement
At Unity 3D’s conference in L.A. last week, OTOY announced their technology would be natively integrated into the Unity engine. Since Unity is the world’s most popular game engine, with around 45 percent market share, this graduates ORBX from a power-user niche to a widely accessible creation format. When OTOY’s integration releases next year, you’ll be able to import and export ORBX directly from Unity, and use OTOY’s OctaneRender to create experiences with extremely realistic visual detail. At launch, this means you’ll be able to create and edit photorealistic scenes directly in Unity, then export a 360 video in the streamable ORBX format. However, it won’t be long before you can move through a scene, as positionally tracked lightfields become supported (more on that later).
Hands on Demo
During the interview I had a chance to try a demo, and it was reminiscent of my experience trying the Oculus (DK2) for the first time. The first thing they showed me was a 360 image from Keloid, a sci-fi short film created using their technology. It literally brought tears to my eyes. All I was seeing was a 360 image in a GearVR, yet as someone heavily entrenched in VR I was blown away. Why? The image looked entirely photo-real, but it was entirely constructed via computer. It was beautiful.
A 360 image is one thing, but how about video? The next demo shows a scene of a park in the middle of a city. At first I think it’s another still image, then I look at a puddle and notice the reflection of the tree branches swaying slightly. Suddenly my brain clicks into full presence mode, and I am there in that park, feeling the chill of a wet morning after a night of rain. As I rotate my head, the light and the reflections in the puddles around me change. Taking off the headset, I ask what camera system they used.
“None,” Urbach said. “It’s all 3D rendered.”
My mouth hangs open. I believe him, but my mind screams it’s impossible.
It Gets Even Better
I ask Urbach when this level of visual quality will come to headsets with positional tracking. He replies that they already have a demo running on the Vive, which they showed at CES this year. To illustrate how it works, he shows me a short video:
Essentially, they pre-render every possible viewing angle within a box of space, allowing you to move freely as long as your head stays within their “Lightfield Cube”. As you can imagine, the file size for something like that would be enormous compared to traditional media. However, this is where OTOY’s elegance and technical superiority truly shine.
ORBX is highly optimized for streaming and for the GPU. Even at current LTE speeds, OTOY is able to stream lightfield media to a VR headset. Since all the computation and heavy lifting is handled in the cloud, the end-user hardware doesn’t need large storage or a massively powerful GPU.
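The basic idea behind a lightfield cube can be sketched with a toy model: pre-render a view at each point of a 3D grid inside the cube, then at runtime serve the pre-rendered view nearest the headset’s position. This is a simplified illustration, not OTOY’s actual implementation (real lightfield rendering interpolates between views rather than snapping), and the grid size and cube dimensions below are assumptions:

```python
# Toy model of a "Lightfield Cube" (illustrative, not OTOY's method):
# pre-render a view at every grid point in a cube of free head movement,
# then snap the live head position to the nearest pre-rendered view.
import itertools

GRID = 8           # samples per axis (hypothetical resolution)
CUBE_SIZE = 0.6    # metres of head movement per axis (assumed)
STEP = CUBE_SIZE / (GRID - 1)

def render_view(x, y, z):
    """Stand-in for an expensive offline render; returns a placeholder."""
    return f"frame@({x:.2f},{y:.2f},{z:.2f})"

# Offline step: pre-render every grid point -- note the cubic growth,
# which is why lightfield files dwarf traditional media.
views = {
    (i, j, k): render_view(i * STEP, j * STEP, k * STEP)
    for i, j, k in itertools.product(range(GRID), repeat=3)
}
print(len(views))  # 512 pre-rendered views for an 8x8x8 grid

def nearest_view(head_pos):
    """Runtime step: snap the head position to the closest grid point."""
    idx = tuple(min(GRID - 1, max(0, round(c / STEP))) for c in head_pos)
    return views[idx]

print(nearest_view((0.0, 0.0, 0.0)))
```

The cubic blow-up in `views` (8 samples per axis already means 512 renders) shows why the pre-rendering happens in the cloud and why an efficient streaming format matters so much at playback time.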
Roadmap to ‘The Matrix’
Right now, OTOY’s focus is on mobile VR and 2D 360 videos. This strategy makes sense since desktop VR is still a small market. However, I was told that next year OTOY will focus on rolling out positionally tracked lightfields for all the major headsets (PSVR, Vive, Oculus). Combined with their finished Unity integration, we should expect to see a number of amazing photorealistic VR experiences on both desktop and mobile.
One of the key things they are working on, both to enable this and to fix the problem of VR content distribution, is making WebVR content stream at the push of a button. Rather than the current pipeline of downloading a file or waiting a long time for browser content to load, ORBX players can start streaming VR content almost instantly and load the rest as it plays. Since ORBX is an open-source format, any website can make use of it, allowing consumers to hop from VR experience to VR experience through the open navigation of the web rather than a single app store.
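That “start almost instantly, load the rest as it plays” behavior is the familiar pattern of progressive chunked streaming. Here is a minimal sketch of the idea (hypothetical names and chunk sizes, not the actual ORBX player protocol):

```python
# Illustrative sketch of progressive streaming (NOT the ORBX protocol):
# playback begins once a small buffer fills, while the remaining chunks
# keep arriving in the background.
from collections import deque

CHUNKS = [f"chunk-{i}" for i in range(10)]  # stand-in for encoded scene data
START_BUFFER = 2                            # chunks needed before playback

def stream(chunks):
    """Simulate a network source yielding chunks one at a time."""
    yield from chunks

def play(source):
    buffer = deque()
    played = []
    started = False
    for chunk in source:
        buffer.append(chunk)
        if not started and len(buffer) >= START_BUFFER:
            started = True  # playback starts long before the full download
        if started:
            played.append(buffer.popleft())
    played.extend(buffer)  # drain whatever is left after the stream ends
    return played

print(play(stream(CHUNKS)) == CHUNKS)  # True: everything plays, in order
```

The contrast with the download-then-play pipeline is the `started` flag: the viewer is inside the experience after two chunks instead of ten, which is what makes hopping between web-hosted VR experiences feel instant.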
As mobile headsets with inside-out positional tracking become available, OTOY’s technology will immediately be ready to bring photorealistic, positionally tracked content to the marketplace. So if estimates of when to expect inside-out tracking on mobile VR are correct, in just a few years we could have cordless photorealistic VR.
Ten years from now, when VR/AR is an integral part of everyone’s lives, we’ll be able to look at the amazing quality of visuals and the accelerated rise of 3D computing and thank OTOY for helping make it all possible.