Exclusive: OTOY Is Bringing Lightfield Baking And More To Its Unity Integration

by Jamie Feltham • January 11th, 2017

The road to photorealistic VR is long, but OTOY is helping to shorten it with the integration of its technology into the free version of Unity.

The company announced last month that its Octane Renderer would be coming to the world’s most popular game development engine, allowing users to import ORBX files and view them within the engine. It’s a huge step forward for creating visually believable VR environments: OTOY is employing lightfield technology to create scenes that are practically indistinguishable from real life, and the Octane Renderer gives everything a cinematic look. You could import a simple item from the Unity asset store, apply the renderer, and have it look strikingly realistic. ORBX files are also optimized for streaming, taking much of the processing load off of the machine running them.

What that means is we could soon start seeing photorealistic VR worlds on hardware that’s not even remotely powerful enough to support this sort of graphical fidelity on its own processors. Today, we have an exclusive look at more features OTOY’s Unity integration will enable.

CEO Jules Urbach runs us through them in the video embedded in this post. He talks about the importance of scene baking, including the advances the company has made in baking scenes while still offering six degrees of freedom (6DOF) within them, as well as adding real-time dynamic elements. These are quite complex components, so we’ll let him do the talking, but you can look forward to these features and others when Unity support is introduced at an undisclosed time later this year.

Speaking to UploadVR, Urbach revealed that the company will go into more detail on many of these features at GDC next month. He said the company will be “showing the full spectrum of our VR and AR/MR content pipeline for artists and developers for the first time. This covers Octane scene-baking at the simplest level, to more advanced light field rendering and streaming powered by Octane integrated tools and streamed to a full 6DOF MR stack on ODG’s R8 and R9 glasses.

“We will also be doing a deep-dive presentation with PowerVR on our initial results leveraging next generation of ray tracing hardware in Octane 4.”

OTOY will also be giving hands-on demos and walkthroughs of both Octane and ORBX “for generating light fields and streaming VR/MR applets, as shown in the video. These can be published to public URLs and played back on the Gear VR through the Oculus Social framework, as well as additional mobile HMDs through the Samsung Internet WebVR browser.”


  • AcroYogi

    Unity’s light baking tech has been broken for quite some time for scenes of any level of serious complexity. Pro users have resorted to Mental Ray and other 3rd-party external solutions. An in-engine integration that works seamlessly could be a quantum bump to the visual quality of Unity-developed products. Bring it!

    • Jules Urbach

      Thanks! GPU baking in Octane is really fast and super high quality. It will be available for free in Unity 2017 with Octane. We built a C# script to test progressive scene baking in Unity + Octane last year, and it worked great. For the final release, we are aiming to hook Octane baking into Unity’s new progressive light mapping system that was introduced at Unite 16.
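      [Editor’s note: for readers curious where such a bake hook sits, here is a minimal, hypothetical sketch using only Unity’s stock `UnityEditor.Lightmapping` editor API. The Octane-side baking integration Urbach describes is not public, so the class name, menu path, and any Octane specifics are illustrative only, not OTOY’s actual script.]

      ```csharp
      // Place in an Editor/ folder. Illustrative only: shows the stock Unity
      // lightmapping calls a progressive GPU baker would plug into.
      using UnityEditor;
      using UnityEngine;

      public static class ProgressiveBakeSketch
      {
          [MenuItem("Tools/Start Progressive Bake (Sketch)")]
          public static void StartBake()
          {
              // Iterative mode re-bakes continuously as the scene changes —
              // the workflow a progressive baker like Octane's would hook into.
              Lightmapping.giWorkflowMode = Lightmapping.GIWorkflowMode.Iterative;

              // Kick off an asynchronous bake so the editor stays responsive.
              if (!Lightmapping.BakeAsync())
                  Debug.LogWarning("Bake could not be started.");
          }
      }
      ```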

      • Gerard Slee

        I am so excited for this integration! Good job! Is there a way to help?

        • Jules Urbach

          Grab Unity with Octane/ORBX when it is in beta and provide feedback so we can make it even better for final release 🙂

      • Dotcommer

        Any plans for Daydream? I realize a lot of Daydream apps are probably built using Unity, but it sounds like you’re tailoring the functionality to Samsung devices.

        • Jules Urbach

          Daydream/Tango is next along with ODG AR/MR. All our Android platforms will be covered by GDC17.

      • Chris

        Will we get this plugin for the Unreal Engine?

    • Robbie Cartwright

      Totally agree, the stuff they showed off in that video looked incredibly realistic! So excited for this!

  • Awesome! Keep going! OCTANE RENDER 4 WOW Amazing @julesurbach

  • Zerofool

    Slightly off-topic, but I was really excited about real-life light field capture since the 2015 OTOY demo video, and I’ve been waiting all this time to be able to experience these holograms (both RL captures and synthetic) on my desktop VR setup. It seems, however, that the stand-alone desktop ORBX player (supporting Vive and Rift CV1) is still not released (to my knowledge), so I’m really looking forward to GDC where I hope we’ll get some news on that front, although it would be great if Jules can provide some details beforehand. It would really be a shame if this tech remains exclusive to mobile VR, where the bigger part of the audience wouldn’t even be able to appreciate how game-changing this is over traditional 360 video.

    • Jules Urbach

      I just posted a detailed reply, but it looks like it won’t show up if I include links.

      Briefly, we are absolutely committed to PC VR, but we had to put all ORBX Media Player dev resources on mobile the past 15 months to support native ORBX playback integration in key partner apps and platforms: Oculus Social, Samsung Internet and ODG.

      The PC player is on track for release after Android, Tango and Daydream are done. It will be automatically added to the ORBX Media Server PC tray widget (you can get this now from our site).

      ORBX Media Server on PC is currently used for one-click streaming of desktop, games, apps and Unity, UE4, Chromium virtual sessions to Gear VR on LAN, and Oculus Social rooms over WAN. The PC server also includes our lighthouse tracker for GVR (and other mobile HMDs), which is how we did the 6DOF Gear VR light field test shown a couple of months back.

      The full PC stack, when finalized, will support direct playback of ORBX content from a URL via a Windows app or web page, just like on mobile. At this point, we will also be able to re-stream this ORBX PC content to multiple independent local mobile HMDs over LAN to reduce storage and compute requirements where it makes sense.
      For live streaming, ORBX content can already be published to the cloud and streamed back in any web page or ORBX native viewport. You can test this with the ORBX JavaScript client on our home page (under cloud demos).

      • Altares

        What about light-field acquisition? You replied to me last time (6 months ago?) that you were in talks with 3rd parties to put it to market, but I really, really need to see and experiment with this. Can’t it be done with standard DSLR rigs like the one you demoed in the video? I am already very well equipped, and I wish you could release the software somehow so that techie people can begin playing with it.

        Also, about light-field volume playback on PC VR: could you please make available (when ready, of course) some very big files for people to do demos without having Internet access? There is the torrent method, which works extremely well for very big files. There is no better advertiser for your technology than PC VR enthusiasts. We do VR demos all the time and, I assure you, many many people out there don’t even understand what VR is until they put on the mask (oops… that’s from a movie).

        Thank you for making VR great again! (and that’s from… 🤢)

        • Surykaty

          I have a few CNC machines and so would love to build my own jig too. I wish they would release software that would convert the acquisition made by the DSLR in the jig into a viewable light field.

          • Jules Urbach

            I was just at MPEG 117 yesterday, working with JPEG Pleno and MPEG WF on light field standards. I plan to submit the light field capture we did of our office as a dataset.

          • Altares

            Thank you! Looking forward to it!

      • Zerofool

        Jules, thank you for taking the time to write this lengthy reply, I really appreciate it. I’m anxiously looking forward to this 🙂
        Like @Altares, I’d also like the option for offline local “playback” of the light field scenes on the PC, but I guess running the server app and the client/player app on the same machine should achieve the same thing.
        I thank you and your team for the innovations you bring to the industry and for setting the bar higher for everyone.

        • Jules Urbach

          The LF viewer is a module for ORBX Media Server, which can be downloaded today (a system tray app for Windows, currently used for desktop and virtual Chromium/UE4 re-streaming over LAN). When we release .ORBX files with LF content, the PC stack could load them locally. The frames are still bigger than SCM 18K video, so foveated 6DOF streaming makes more sense (and we are turning that on for remote ORBX media files when bandwidth is < 15 Mbps).

          • Zerofool

            Thanks for elaborating! Now I get the picture more clearly.
            I’ll be eagerly awaiting the completion of the PC stack and the release of the .ORBX files with LF content.

      • Joel Douek

        Very cool news! We’d been in discussions with OTOY to create some next-gen spatial audio experiences with lightfield imagery (“Still Life With Sound”) & this will make our audio integration finally possible & easy!! Hope to re-connect at GDC.

  • MrGreen72

    Awesome Goldorak shirt Jules! Mad nerd cred there! 😉

    • Jules Urbach

      <3

  • Anthony Alan Phares

    I’ve been extremely excited about this tech for a while now; still, one burning question: where’s the Batman? 🙂

    • Jules Urbach

      It was released in December for Mattel Viewmaster!

      • Anthony Alan Phares

        Okay, crossing my fingers that more content for the Gear VR will become available.

        • Jules Urbach

          I am working on it.

  • Walextheone

    This is the one software technology I’ve been waiting for! Really excited to see this bad boy in action =)

  • Wow, as a Unity developer I find this news super-interesting!! And considering that surely they’ll add some demo scenes and assets to showcase it, we’ll have lots of material to work on, too!

    • Jules Urbach

      We’ll seed some samples, but it is really easy to pick up. Lucas had only spent half a day with the alpha build before we showed it live at Unite 16 in November.

  • rcheezumVis

    Jules, very excited to see this collaboration. I remembered seeing your talk and chatting with you at GTC regarding the future of Brigade/Octane and integration with game engines. Looking forward to that holodeck feature some day!

    • Jules Urbach

      Me too!

  • Surykaty

    The thing that totally blows my mind is the Imagination Technologies ray-tracing accelerator stuff. I wish they would provide more information about their future plans: whether a PCI-E ray-tracing card is coming soon, and whether it scales well watt-for-watt from their “2W mobile chip” to a “120W desktop-class chip”.

    • RSIlluminator

      That’s what I’m curious about as well. They did have the Caustic Visualizer plugin and card at one point, but that was retired. I’d like to see more tech like this for people who make content.

  • Behram Patel

    RAD! Unleashing all hell in the gaming world. I like.
    b

  • morphingrangers

    I don’t get it…