We saw a lot of inside-out tracking solutions at CES last week, but it wasn’t until today that we could talk about one of the most impressive: Eonite’s.
This computer vision company, based in Palo Alto, California, is working on its Vantage Head Tracker system, which brings positional tracking to any VR headset without external cameras or sensors monitoring you in a pre-defined environment. That doesn’t mean a new headset like the standalone reference designs from Qualcomm and Intel, or devices like Oculus’ Santa Cruz and Microsoft’s HoloLens. In fact, Eonite isn’t focused on hardware at all; its product is entirely software.
The company’s solution delivers 6 degrees of freedom (6DOF) movement with what it claims is sub-millimeter accuracy and less than 15 milliseconds of latency. It also includes real-time obstacle detection and even supports mapping multiple rooms. Eonite’s aim is to license this software to companies making their own inside-out headsets, and it’s available starting today. Interest is apparently “tremendous”.
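For readers unfamiliar with the term, “6 degrees of freedom” means the system tracks both where your head is and which way it’s facing: three axes of position plus three of rotation. A minimal illustration of that idea (my own generic sketch, not Eonite’s API):

```python
from dataclasses import dataclass
import math

@dataclass
class Pose6DOF:
    # Translation in meters: 3 degrees of freedom
    x: float
    y: float
    z: float
    # Orientation as yaw/pitch/roll in radians: 3 more degrees of freedom
    yaw: float
    pitch: float
    roll: float

    def translate(self, dx: float, dy: float, dz: float) -> "Pose6DOF":
        """Return a new pose moved by (dx, dy, dz) in world space,
        leaving the orientation unchanged."""
        return Pose6DOF(self.x + dx, self.y + dy, self.z + dz,
                        self.yaw, self.pitch, self.roll)

# A headset 1.7 m off the floor, turned 90 degrees to the left...
pose = Pose6DOF(0.0, 1.7, 0.0, math.pi / 2, 0.0, 0.0)
# ...then the wearer steps half a meter along the -z axis.
stepped = pose.translate(0.0, 0.0, -0.5)
```

A 3DOF system, by contrast, would track only the yaw/pitch/roll half of that structure, which is why losing positional tracking (as happens later in this demo) is so noticeable.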
“Eonite started a year ago to productize and monetize the technology that was being built over the prior four years by the technology founders,” Youssri Helmy, CEO and founder of Eonite, told UploadVR. Those founders are Dr. Anna Petrovskaya and Peter Varvak, who invented a new algorithm to “extract super-high accuracy from commodity sensors.” Rather than partnering with a specific company, Eonite wants to democratize inside-out tracking and make it widely available to hardware makers, not just companies like Microsoft and Oculus with their own custom solutions. The company chose to go after positional tracking for VR and AR headsets first, but “it could be adapted to robots and drones”.
I tried the software out for myself at CES, with a depth sensor crudely taped to an HTC Vive that had its usual trackers covered up. In the demo, I was able to walk around a virtual living room not too dissimilar to the hotel suite I was really standing in, with small drones buzzing around my head. Simply looking at the drones would blow them up, which let me gauge the latency. The system was impressive compared with other inside-out tracking kits I’ve seen so far, though it wasn’t as solid as the Vive’s usual Lighthouse tracking. Walking around alternated between feeling smooth and battling screen judder, while that 15ms of latency produced an ever-so-slightly noticeable delay in head movements.
That said, the system was reliable for the most part. I could take down drones without feeling like I was fighting latency, and I moved around the room without issue. Eonite’s solution isn’t really meant to match SteamVR. “You can obtain higher accuracy if you have unlimited budget on compute and sensor,” Helmy said. “We use 3D commodity depth sensors and commodity IMUs [inertial measurement units], and the amount of signal we extract from that is our claim to fame.”
Though impressive, the demo obviously wasn’t without its flaws. At several points I seemed to jump from one side of the room to the other before finding myself back in my original spot, which Eonite attributed to the undisclosed third-party camera it was using for the purpose of the demo. Tracking would also change from 6DOF to 3DOF if I stared at the floor, though the team assured me this would not be the case for the full release.
What’s available today is intended for tethered headsets, though support for mobile devices is on the way and, ultimately, far more important. “Inside-out tracking offers more possibilities than outside-in,” Helmy said. “It offers a lot more assets to deal with from a developer perspective.”
Product partnerships are set to be announced in the coming months, and the system “should be in a tethered product in the next two quarters,” meaning we may well see new inside-out tracked products in the near future. As for mobile, Helmy said the first wireless headset to integrate it will probably be available “early next year”.