Oculus Engineers Explore New SDKs at Oculus Connect

by Matthias McCoy-Thompson • October 1st, 2015

At its core, Oculus Connect is a developer conference. Aside from all the incredible announcements, demos, and knowledge bombs from John Carmack, most of the conference is about developers learning from each other. Whether in lectures from top Oculus engineers or conversations in the Developer Lounge, there is a tremendous amount to learn at Connect about how to develop incredible VR experiences.

Of course, all Oculus development starts with the company's SDKs. In the past month, Oculus released new mobile and PC SDKs with a host of new features and fixes. At Connect, engineers went over the specifics of both SDKs and how they work, including info on the Touch SDK. Even more tantalizing was the announcement during the keynote that the next PC SDK, aptly named 1.0, will be the final one.

Mobile SDK


In the talk on “Developing with the Oculus Mobile SDK,” Johannes van Waveren, an Engineering Manager at Oculus, dove deep into how the Mobile SDK works. He went into the specifics of how to use asynchronous timewarp (ATW) to deliver a consistent frame rate, how multi-view is able to render two images at the same time, and how running the simulation ahead of display limits the time ATW can keep re-displaying images.

One of the more interesting aspects of the talk was how to detect problems during development. He explained that running the command “adb logcat -s VrApi” displays a line of performance indicators, including frame rate, prediction time, and counts of torn, early, and stale frames.
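The VrApi log line packs these indicators into comma-separated key=value pairs. As a rough illustration of how a developer might scan them automatically, here is a small parsing sketch; the sample line and exact field set are assumptions, since fields vary by SDK version:

```python
# Parse a VrApi performance line from `adb logcat -s VrApi`.
# The sample line below is illustrative; exact fields vary by SDK version.
sample = "FPS=60,Prd=49ms,Tear=0,Early=0,Stale=0,VSnc=1,Lat=1,CPU4/GPU=2/1"

def parse_vrapi_line(line):
    """Split a comma-separated key=value log line into a dict of strings."""
    stats = {}
    for pair in line.split(","):
        key, _, value = pair.partition("=")
        stats[key] = value
    return stats

stats = parse_vrapi_line(sample)

# Torn or stale frames signal that the app is missing its frame deadlines.
if int(stats["Tear"]) > 0 or int(stats["Stale"]) > 0:
    print("Frame timing problem detected")

print(stats["FPS"])  # prints "60"
```

In practice a script like this could watch the stream over time and flag any window where torn or stale frames start accumulating.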


Using this information, developers can optimize and correct problems in their demos. In particular, he noted that extralatency mode can be used when frames are consistently completing late: it allows twice as long to complete every frame. While this adds latency, it gives the GPU more time to render.
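The frame-budget arithmetic behind extralatency mode is straightforward: at a 60 Hz refresh, the normal budget is one vsync interval, and queuing one extra frame of latency doubles it. A quick sketch (the 60 Hz figure is assumed for illustration):

```python
# Frame budget with and without extralatency mode at a 60 Hz refresh rate.
refresh_hz = 60
vsync_interval_ms = 1000.0 / refresh_hz          # ~16.7 ms normal budget
extra_latency_budget_ms = 2 * vsync_interval_ms  # ~33.3 ms with one frame queued

print(round(vsync_interval_ms, 1))        # prints 16.7
print(round(extra_latency_budget_ms, 1))  # prints 33.3
```

The doubled budget is why the mode trades a frame of added latency for far fewer missed deadlines on a struggling GPU.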

He also gave a brief roadmap of upcoming developments. Support is coming for a number of new environments and APIs, including Android Studio, Gradle, and Vulkan. Multi-view support and a multi-pass ATW compositor are planned, and that ATW compositing will also be exposed to Unity and Unreal, allowing for more flexible development. Oculus also plans to simplify the native app framework life-cycle. If you’re still using Unity 4, however, support will be phased out. Overall, the Mobile SDK continues to improve and offers a host of tools for developers to maximize the performance of a phone for VR experiences.

PC SDK

Anuj Gosalia, Engineering Director at Oculus, led the talk on “Building for the Rift with the Oculus PC SDK,” which covered everything from the design of the API and how the software stack works to how to design apps that show the true capabilities of the Rift and Touch.

The latest Rift SDK, 0.7, has a number of improvements over previous versions. The most important is Direct Driver Mode. As Brendan Iribe pointed out during his keynote, everyone suffered through the era of Extended Mode. Now, the software should detect a Rift automatically and send the experience to both the headset and the monitor.

There are also a number of technical fixes. The SDK now uses sRGB rendering, so the GPU no longer has to make assumptions about color. It also supports Windows 10 with Direct Driver Mode, although only 64-bit versions of Windows work with the new SDK. Finally, all sorts of fixes were added for resetting the location of the headset, reducing undesirable spin, and supporting future HMDs.

The SDK also includes support for Oculus Touch, returning the position and state of two Touch controllers. A key thing to note is that controller hand pose data is separated from input state because the two come from different systems, so they are reported at different times. Hand pose data is reported at the same time as the headset data, ensuring consistency between the view of the hands and their position.
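The split described above can be sketched abstractly: hand poses travel with tracking (sampled together with the head pose for the same predicted display time), while button and trigger state is polled over a separate path. The following frame-loop sketch models that idea; none of these names or values come from the Oculus SDK:

```python
# Illustrative model of tracking state vs. input state for Touch.
# All names and values are hypothetical, not Oculus API calls.

def get_tracking_state(predicted_display_time):
    # Head and hand poses come from the same tracking system, so they are
    # sampled together for the same predicted display time.
    return {
        "time": predicted_display_time,
        "head_pose": (0.0, 1.7, 0.0),
        "hand_poses": [(-0.2, 1.2, -0.3), (0.2, 1.2, -0.3)],
    }

def get_input_state():
    # Buttons and triggers arrive on a separate path, on its own schedule.
    return {"buttons": {"A": False, "B": True}, "trigger": 0.4}

# Per frame: render with head and hand poses that are consistent with
# each other, and read controller input independently when needed.
tracking = get_tracking_state(predicted_display_time=0.016)
inputs = get_input_state()
print(len(tracking["hand_poses"]))  # prints 2
```

The design choice this models is the one from the talk: because poses and button state are timestamped by different systems, bundling hand poses with the head pose keeps the rendered hands aligned with the rendered view even when input packets arrive off-cadence.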

The 1.0 SDK to Rule Them All

There wasn’t a lot of information reported on Oculus SDK 1.0 besides the announcement that it will be coming in December and it will be the final SDK before the consumer release. Still, it’s crazy to think that we’re getting close to the final SDK. Over ten different versions, starting with 0.1.3 way back in March 2013, Oculus has slowly been improving the way its software interacts with its VR headsets.

It’s been a marathon for both Oculus and VR developers up to this point. We are now reaching the finish line, and I’m sure all VR developers will rejoice to know that SDK updates won’t keep breaking their applications. But more exciting than that is what the announcement means: VR is almost ready for mass consumer adoption. Developers have had to endure all sorts of bugs and issues that the average consumer would never put up with. Thanks to their patience and the hard work of many engineers at Oculus, the SDK that will define the future of computing is almost ready.
