
We "Tango with Autism" in VR on Google's Project Tango

We "Tango with Autism" in VR on Google's Project Tango

Last week at I/O 2015, Google showed off the first development kit for Project Tango – a computer vision project that could have some serious implications for VR. The project is still fairly early, especially for VR, but that didn't stop a number of developers (and some extremely tired Google employees) from getting together on a Saturday at Adobe's offices in San Francisco to hack on the device. I had never had a chance to experience Tango myself – so I moseyed on down to check out what everyone was up to and try it for myself.

The hack had only begun that morning at 9am – so the developers (most of whom had never used the platform before) had just nine hours to complete their projects. That said, a number of the teams I spoke to described the platform as very easy to pick up, meaning they could quickly focus on building their experiences.

One of the most impressive ideas to come out of the hackathon was from Cory Corvus (who was last seen making it rain with Palmer Luckey) and Manny Marquez. They decided to "Tango with Autism" with their entry, creating an experience meant to help teach simple life skills to people with autism spectrum disorders using VR.

Tango with Autism - screenshot

Due to the time constraints of the hackathon, the team decided to stick with a single simple life-skill experience as a proof of concept – crossing the street. Given that this was a Tango hack and the team had an extensive VR background, the subject made plenty of sense: it allowed them to explore walking in mobile VR.

Using a Durovis Dive 7 and a Tango, I was able to get a rough glimpse into the future of mobile VR – a future that will soon be a lot smaller than a 7″ tablet strapped to your face. I say rough because the Tango, as is, is not ready for VR. The optimizations haven't been done yet; the latency, for example, sits at about 80ms, roughly four times that of the DK2 and far too high for vomit-bag-free VR. Corvus, however, is confident that a "few simple software optimizations" could cut that latency in half – which would still be too slow. The human eye can detect anything greater than about 20ms of latency, so for VR to be as comfortable as possible it has to beat that mark, and right now the Tango doesn't.
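Corvus didn't spell out which optimizations he had in mind, and Google hasn't published a VR pipeline for Tango, but one standard trick for hiding tracker latency on mobile headsets is to forward-predict the head pose by the expected motion-to-photon delay using the gyro's angular velocity. The sketch below is purely illustrative – my own assumption of how such a prediction step might look, not code from Tango or the hackathon team:

```python
import numpy as np

def predict_orientation(q, omega, dt):
    """Forward-predict a head orientation quaternion by dt seconds.

    q     : current orientation as [w, x, y, z]
    omega : angular velocity from the gyro in rad/s, body frame
    dt    : prediction horizon in seconds (e.g. the expected
            motion-to-photon latency, such as 0.04-0.08s here)
    """
    speed = np.linalg.norm(omega)
    angle = speed * dt
    if angle < 1e-9:
        return np.asarray(q, dtype=float)  # effectively no rotation

    # Build the small rotation accumulated over dt as a quaternion
    axis = omega / speed
    half = angle / 2.0
    dq = np.array([np.cos(half), *(np.sin(half) * axis)])

    # Hamilton product q ⊗ dq: apply the body-frame rotation after q
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = dq
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])
```

Prediction like this trades latency for a small amount of overshoot when the head changes direction quickly, which is why it can only paper over – rather than eliminate – a slow tracking pipeline.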

https://www.youtube.com/watch?v=h6M9uAOrL5M

“Tango with Autism” Demo

But I have my VR legs under me, so I braved the latency and tested the headset out for myself – and was actually pretty impressed. Moving past the latency (I'll give it a second to catch up), the experience of walking in VR without wires was pretty sensational – until I moved around too quickly.

Luckily for me, Corvus and Marquez weren't going to have me moving around too fast. Turning to my left, I met an adorable hippo who instructed me (in a voice recorded by Marquez) to look both ways before eventually leading me across the street. In real life, Corvus stood alongside me as I walked across the street with the hippo. As I walked I definitely felt a little bit of jitter, and something about the experience felt a little less secure than, say, the Vive, but I still had a big smile on my face because I was walking in VR without wires – something I had been waiting for ever since I strapped on a headset for the first time.

Computer vision and virtual reality optimizations still need to be made for this project to be truly viable for VR, but it is the route we need to take – both for VR and for AR. Speaking with a Google Tango employee at the event, I was told that for Tango "VR is not a top priority, but is something we are working on with the project." It would appear that, for Tango, other vision problems – like drone obstacle avoidance, which if properly implemented could disrupt the entire shipping industry – currently sit a little higher on the task list, but the work the team is doing can be directly applied to solving the mobile tracking issue. It may be a race to the finish between the growing computer vision team at Oculus and the team at Google Tango to see who solves mobile positional tracking first.

Update: You can download “Tango with Autism” for yourself here
