The latest update to the Oculus Integration for Unity adds the ability to use Quest’s controller-free hand tracking in the editor.
Facebook added experimental controller-free hand tracking to Quest in December. It lets you interact with VR content using your bare hands in the open air rather than through Touch controllers, driven by computer vision algorithms powered by machine learning.
Since the release of the Oculus Link beta in November, developers building apps for Quest have been able to use a high-quality USB 3.0 cable to instantly iterate on changes made in Unity, the game engine used for most VR apps.
However, the Oculus Rift SDK doesn’t support hand tracking, so developers building apps with the feature have had to compile a build and send it to the Quest headset after every code change. Depending on the scale of the project, this could take anywhere from ten seconds to a few minutes each time.
With the latest version of the Unity Integration, this is no longer a problem. Link can pass hand tracking through to the Unity editor, so simply pressing ‘Play’ lets devs rapidly iterate on hand tracking interactions. Interestingly, Quest still doesn’t pass the microphone through Link, which would seem to be an easier task.
This of course raises the question: if hand tracking can work via Link in the Unity editor, why couldn’t it work in an actual PC VR app? Because the Oculus Rift SDK doesn’t support the feature, there’s currently no way for developers to ship it in a PC VR app, and Facebook has been relatively vague about whether Rift S will get it. In fact, the Oculus Quest store still doesn’t accept apps that support hand tracking, though Facebook’s statements indicate that should change some time soon.