ResearchVR Episode 29: Did Somebody Say Hand Interaction? Round-table With Alex Colgan
This week on ResearchVR we dig deep into Leap Motion with Alex Colgan, lead writer at the startup.
This week’s episode is a little unusual. Considering Upload’s recent hands-on session with TPCAST and the ResearchVR episode on challenges of wireless VR, we start our discussion with a quick recap on this topic. Then we move on to our guest, the person with the overview of what Leap Motion is cooking up, their experience, and best practices to date.
The most exciting technology this hand tracking company is working on is embedded sensors for mobile VR. You may be skeptical, but everything points toward a genuinely useful solution: a field of view as wide as your natural one, low latency, and high tracking flexibility – all confirmed by Azad’s hands-on experience with the prototype.
Because Leap Motion’s tracking is based purely on video input, the company also has a lot of experience with the embodiment of hands grounded only in visual feedback. We had a heated discussion about the “wow effect” bias, replay value, anticipation, and whether or not it all leads to diminished experience fidelity.
Last but not least, we also dig neck-deep into VR GUI topics. In his recent article on the subject and in our discussion on the podcast, Alex compares where we are with user experience and expectations for GUIs on desktop versus in VR. It turns out that on the desktop we’ve evolved beyond skeuomorphism (the physical metaphor), while in VR we still need more direct cues. However, extrapolating from the desktop learning curve, VR interfaces will soon be far more advanced.
Do you have any questions for us? Ideas for future guests or future discussion topics? Let us know on Twitter and down in the comments below!