Yesterday we wrote about OTOY’s new light field scanning system. Today we’re announcing that you’ll be able to buy that rig later this year, at an unofficial price of “under $10,000.” And yes, the images are stunning.
Recap: the rig (pictured below) takes two Canon DSLRs and spins them around in a circle, capturing a dense light field image. [UPDATE: only one DSLR is actually capturing images; the other one is literal dead weight. Mad props to Marko Vukovic for bringing this to our attention.] The resulting data, when properly processed, can be rendered with OTOY’s pipeline and delivered to a VR headset with full head tracking. It’s a static scene, but OTOY has alluded to future video capture.
Yesterday I got to check out that scene captured from OTOY’s office, as well as a purely CG light field. And yeah, it’s pretty amazing. In fact, it makes the experience of looking around a room … almost trippy. But to understand why, you have to really grasp what’s going on with light fields. Intuitively, we think of vision in VR as rays projecting out from the user’s perspective, sort of like ray tracing, or the ancient Greek belief that our eyes ‘project’ light outwards to strike the objects we’re looking at. That model is helpful when you’re thinking about how to set up camera arrays or render engines, but it’s useless for discussing light fields.
A captured (or rendered) light field does not try to represent the properties of the subject matter, and it does not reference the underlying objects at all. It simply recreates the light emitted and reflected through a slice of space: instead of a scene with objects in it, you have surfaces and volumes of light. The most accurate mental model I can conjure is an enclosed volume of light, a magical box or 3D window.
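One way to make that concrete is the classic 4D light field idea: the color arriving along any ray depends only on where your eye is and which way it looks, and rendering a new perspective is just a table lookup, with no scene geometry anywhere. Here’s a toy sketch in Python with made-up dimensions; this is my illustration of the concept, not OTOY’s actual pipeline:

```python
# A toy 4D light field: the color of a ray indexed by eye position
# (u, v) on a capture surface and view direction (s, t). All sizes
# and the random "captured" data are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
# 8x8 capture positions, 16x16 directions, RGB color per ray.
light_field = rng.random((8, 8, 16, 16, 3))

def sample(u, v, s, t):
    """Nearest-neighbor lookup. (u, v) is the eye position on the
    capture surface, (s, t) the view direction, all in [0, 1)."""
    i, j = int(u * 8), int(v * 8)
    k, m = int(s * 16), int(t * 16)
    return light_field[i, j, k, m]

# Moving your head changes (u, v); turning it changes (s, t).
# The scene's objects never appear in the code: only light does.
color = sample(0.5, 0.5, 0.25, 0.75)
```

A real renderer would interpolate between nearby samples instead of snapping to the nearest one, but the point stands: once the rays are captured, any perspective inside the volume is a lookup, not a re-render.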
If you’re outside the box, looking back at it, its outline is the outline of a window into another world: the world where the light field was captured or rendered. As you shift your head, the perspective moves naturally, but the outline doesn’t behave like a regular window frame; it shifts and changes with the shape of the box’s volume. It’s like looking through a wormhole.
The real magic happens inside the box, though. If you move your head forward so the box encloses your entire head, suddenly you’re in the middle of the light field. The display can now deliver any perspective on the fly, so you can look around in true stereo 3D, in 360 degrees. When you turn your head or move around, everything behaves correctly: the objects parallax (even vertically), the stereo stays correct even when you tilt your head sideways, and the reflections on the office’s stainless steel fridge shimmer as you move. The effect is profound, though I’d love to see it on a higher resolution display. At this point, the DK2 is practically legacy hardware, and it was clear the screen resolution was the limiting factor. I’d give my eye teeth for that demo in a Vive or CV1.
To answer a few specific questions we’ve gotten: the total area you can move around in is the area ‘scanned’ by the rig (or rendered in the engine). It’s enough that you can sit in a chair and move your head around comfortably. If you try to break it, you can: lean far enough forward and your head pokes out of the light field sphere, and you see black. If you’re behind the light field volume, your field of view is reduced to the perimeter of the wormhole. Neither light field I saw had any objects contained within the volume, though some of OTOY’s demos suggest you can have rendered objects display correctly inside it. Objects outside the volume are properly represented with full stereo ‘roundness,’ occlusion, and surface reflection. You can also ‘stack’ light fields, so if you wanted a Vive-compatible, full-room light field, you could do multiple scans or renders and stack them into an arbitrarily large volume. The demo I saw was just one volume (the scan) or a few stacked (the render). Personally, I thought it was kind of cool that there was an edge; I spent more time playing around with the perspective at the edge of the wormhole than I did with my head inside the light field volume.
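The edge behavior described above (black once your head leaves the scanned region, a larger walkable area by stacking volumes) boils down to a simple containment test. A hedged sketch, with box shapes and names that are my own invention, not OTOY’s:

```python
# Each scan or render covers one axis-aligned box of valid viewpoints.
# Stacking boxes extends the walkable area; outside every box there is
# no captured data, so the renderer shows black.
from dataclasses import dataclass

@dataclass
class Volume:
    lo: tuple  # (x, y, z) minimum corner, in meters
    hi: tuple  # (x, y, z) maximum corner, in meters

    def contains(self, p):
        return all(l <= c <= h for l, c, h in zip(self.lo, p, self.hi))

def active_volume(volumes, head):
    """Return the first volume enclosing the head position, or None
    (the viewer has poked out of the light field and sees black)."""
    for vol in volumes:
        if vol.contains(head):
            return vol
    return None

# Two stacked 1 m cubes extend the walkable area to 2 m along x.
stack = [Volume((0, 0, 0), (1, 1, 1)), Volume((1, 0, 0), (2, 1, 1))]
inside = active_volume(stack, (1.5, 0.5, 0.5))   # second cube
outside = active_volume(stack, (3.0, 0.5, 0.5))  # None: black
```

The captured rig produces a sphere rather than a box, but the principle is the same: valid viewpoints form a bounded region, and the edge is exactly where the data runs out.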
Now, capturing light fields is still going to be tricky, and as impressive as the scan was, it wasn’t quite as awesome as the rendered light field. The scan wasn’t totally complete, so even inside the box there were black borders at the ceiling and floor. And the head tracking on the DK2 got in the way a little bit, as the image jumped and glitched if I turned my head too far to the side. I don’t know if this explains the difference, but OTOY CEO Jules Urbach pointed out that the captured light field is spherical, whereas they prefer to render their light fields as cubes. (You can stack cubes, after all, and sticking multiple light field renders or captures together extends the area you can move around in.)
Overall the scan was an impressive demo, but I’m glad I saw the rendered light field second. That one was downright magical; I spent quite a while ducking and looking up and through the volume of light, thrilled by this ‘object’ that I could look through. It was a little bit sharper than the scan, and the color felt like it popped a little bit more. (OTOY intentionally didn’t get the most out of the captured images; Urbach said they had HDR data but didn’t need it for the demo. In the future they could use the entire HDR image and throw a curve on it for a more dynamic result, though.) The image was also very slightly cleaner than the scan, which had the tiniest artifacts in places like the windows. But that’s just some obligatory nitpicking, frankly. Both images (experiences?) were mind-blowing.
If someone had a really reliable way to capture light field video — and it’s hard to handicap how far away from that we are — I’d be willing to call it, and say non-head-tracking VR video was dead. But for the moment, it’s still images only. The rig itself is all off-the-shelf parts (plus OTOY’s magic pixie dust), and Urbach seems committed to getting this thing in consumers’ hands. So start saving up, and later this year you could be scanning your own office or living room. Or think big, maybe scan some wonder of the world or a beautiful landscape. I could’ve spent an hour staring at the floor of OTOY’s office through some fuzzy goggles, so I can only imagine what the rest of the world must look like.