
Valve, SMI and Tobii Preview VR's Eye Tracking Future In HTC Vive

Over the last week we learned that by spending essentially $300 to purchase three Vive Trackers, you will be able to bring your legs, feet and torso into VR — so you can kick a dinosaur in the face without even looking at it. Dinosaur kicking for $300 is certainly funny, but it’s also a great example of a broad effort by developers and hardware manufacturers to make virtual worlds more responsive to human behavior. Another is more robust hand and finger tracking, so the incredible variety of quick and precise movements in your hands is accurately represented in a virtual world. Still another example is eye tracking, and we’ve seen demonstrations from both Tobii and SMI in the HTC Vive offering a glimpse of how much better future VR systems will be at understanding our behavior.

A look inside a headset with eye tracking from Tobii.

New Tools For Game Designers

After a few minutes using the tech from SMI and Tobii, I noticed I was starting to unlearn a behavior I’d grown accustomed to in first-generation VR. Namely, I’ve gotten in the habit of pointing my head directly at objects to interact with them. That’s because current VR systems only understand where your head is pointed. Some games, particularly those on mobile VR, use this “gaze detection” as the primary method of interacting with the world. Tobii, in contrast, offered a very interesting test where I tried to throw a rock at a bottle in the distance. My aim was so-so on the first few throws, but that was without eye-tracking. When eye-tracking was turned on, they asked me to pick up a glowing orb and throw that instead. This time, almost every throw collided with a bottle.

Initially, I couldn’t understand why I’d want the computer to help me so much. As long as I kept my eye on the bottle and made a decently strong throw, I’d hit my target every time. The glowing orb could also be recalled by pressing a button on the controller, so I could throw the orb and, the instant it collided with a bottle, recall it to my hand like Thor’s hammer. It was just a simple tech demo, but once my brain started getting accustomed to this new capability, I made a game out of seeing how quickly I could eliminate all the bottles by throwing the orb, recalling it the moment it collided, locking eyes on the next target and then immediately throwing it again.
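
To make the idea concrete, here is a minimal sketch of how a gaze-assisted throw like the one in the orb demo could work. Everything in it — the function name, the blend factor, and the vector math — is an illustrative assumption, not Tobii’s or SMI’s actual implementation.

```python
# Hypothetical sketch: blend the player's raw throw direction toward the object
# their eyes are fixed on. Not any vendor's actual code.

def assist_throw(throw_dir, hand_pos, gaze_target, assist=0.6):
    """Return a corrected throw direction.
    throw_dir, hand_pos, gaze_target are (x, y, z) tuples; assist is 0 (no help) to 1."""
    # Direction from the hand to the gazed-at target
    to_target = tuple(t - h for t, h in zip(gaze_target, hand_pos))
    length = sum(c * c for c in to_target) ** 0.5 or 1.0
    to_target = tuple(c / length for c in to_target)
    # Weighted blend of the raw throw and the gaze direction, then re-normalize
    blended = tuple((1 - assist) * d + assist * t for d, t in zip(throw_dir, to_target))
    norm = sum(c * c for c in blended) ** 0.5 or 1.0
    return tuple(c / norm for c in blended)

# Example: a slightly off-target throw gets pulled toward the bottle the player is watching.
print(assist_throw((0.1, 0.2, -1.0), (0.0, 1.4, 0.0), (0.5, 1.0, -5.0)))
```

The interesting design question, as the next paragraph suggests, is how much of that correction a game should apply — full assist turns aiming into pure target selection, while a lower factor keeps throwing skill in the loop.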

This is what it took for me to realize just how empowering eye tracking will be for VR software designers. The additional information it provides will allow creators to make games that are fundamentally different from the current generation. With the example of throwing that orb, it was like I had been suddenly handed a superpower and I naturally started using it as such — because it was fun. It is up to designers to figure out how much skill will be involved in achieving a particular task when the game knows exactly what you’re interested in at any given moment.

This is a screen grab from Tobii’s demo showing my eye movements over ten seconds. The purple lines represent what caught my eye in that virtual world over that length of time. This type of data is already used to optimize video game design.

Higher Resolution Headsets May Need Eye Tracking

Eye tracking will be useful for other purposes too, including foveated rendering and social VR. Foveated rendering focuses the most detail in the center of your vision where your eyes are actually pointed. Your eyes see less detail in the periphery, so if the computer knows exactly where your eyes are pointed it dials up the amount of detail in the right spot while saving resources in places you’ll never notice. As manufacturers look at putting higher resolution displays in VR headsets, eye tracking that enables foveated rendering may become fundamental to that effort because it could help keep computers at affordable prices despite pushing more pixels.
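
As a rough illustration of that trade-off, the sketch below shows one simple way a renderer could budget detail around the tracked gaze point. The quality thresholds and the falloff curve are assumptions made up for this example; real foveated rendering pipelines are considerably more sophisticated.

```python
import math

# Illustrative only: a per-region quality factor that falls off with angular
# distance from where the eyes are pointed. Threshold values are assumptions.

FOVEA_RADIUS_DEG = 5.0       # full detail inside roughly the foveal region
PERIPHERY_RADIUS_DEG = 30.0  # beyond this, drop to the lowest quality

def shading_quality(gaze_x_deg, gaze_y_deg, region_x_deg, region_y_deg):
    """Return a 0..1 quality factor for a screen region, based on its
    angular distance from the tracked gaze point."""
    dist = math.hypot(region_x_deg - gaze_x_deg, region_y_deg - gaze_y_deg)
    if dist <= FOVEA_RADIUS_DEG:
        return 1.0   # the eyes are pointed here: render at full resolution
    if dist >= PERIPHERY_RADIUS_DEG:
        return 0.25  # far periphery: render at a quarter of the quality
    # Linear falloff between the fovea and the periphery
    t = (dist - FOVEA_RADIUS_DEG) / (PERIPHERY_RADIUS_DEG - FOVEA_RADIUS_DEG)
    return 1.0 - 0.75 * t

# A region 2 degrees from the gaze point renders at full quality,
# while one 40 degrees away drops to 0.25.
print(shading_quality(0, 0, 2, 0))   # -> 1.0
print(shading_quality(0, 0, 40, 0))  # -> 0.25
```

The savings come from the fact that most of the display sits in the periphery at any given moment, so the GPU spends its budget only where the viewer can actually perceive the extra pixels.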

Make Eye Contact

https://www.youtube.com/watch?v=uJgQLF-rO7g

Eye tracking also dramatically increases expressiveness in communication. In Valve’s booth at GDC, both SMI and Tobii demonstrated three-person social VR experiences in which I hung out with other people and had a conversation. Tobii showed its technology integrated with the popular multiplayer world Rec Room, while SMI let me chat with someone in Seattle as if he were standing right next to me. Social interaction in VR with current consumer technology is fairly awkward. You can get some sense of a person’s interest from their hand and head movements, but to really connect with someone you need eye contact, and both Tobii and SMI enabled that natural connection regardless of physical distance.

I wouldn’t say any of these technologies are consumer ready just yet, but they do show a sophistication, ease of use and affordability that we haven’t seen before. In fact, all the technologies mentioned in this post are being distributed to select developers as kits so they can start to build software around these upcoming advancements. FOVE is distributing an eye-tracking headset too. Meanwhile, both Google and Facebook have acquired eye tracking technologies within the last year — underscoring the expectation that the technology will power future headsets. All of this indicates that we are getting much closer to next-generation systems that will enable far more compelling and responsive virtual worlds than the ones we have today.

“I like to think of this as an extension of the development of the human-computer interface,” said Valve Developer Yasser Malaika, in an interview with UploadVR. “You started with command lines where you needed a lot of memorization, then moved to GUIs…now with VR we’re bringing more of the human body into it…your whole body the computer can now respond to. And adding eyes is another layer where it’s more responsive to you. It is the computer adapting to you rather than it asking you to adapt to it.”
