ResearchVR Episode 29: Did Somebody Say Hand Interaction? Round-table With Alex Colgan

by ResearchVR • January 11th, 2017

This week on ResearchVR we dig deep into Leap Motion with Alex Colgan, lead writer at the startup.

This week’s episode is a little unusual. Considering Upload’s recent hands-on session with TPCAST and the ResearchVR episode on challenges of wireless VR, we start our discussion with a quick recap on this topic. Then we move on to our guest, the person with the overview of what Leap Motion is cooking up, their experience, and best practices to date.

Episode Preview

The most exciting technology this hand tracking company is working on is embedded sensors for mobile VR. You can be skeptical, but everything points toward a useful solution with a field of view as large as your real-world one, low latency, and high tracking flexibility, all confirmed by Azad's hands-on experience with the prototype.

Since Leap Motion's tracking is based purely on video input, the team also has a lot of experience with the embodiment of hands based only on visual feedback. We had a heated discussion about the "wow effect" bias, replay value, anticipation, and whether it all leads to diminished experience fidelity.

Last but not least, we also dig neck-deep into VR GUI topics. In his recent article on the subject and in our discussion on the podcast, Alex compares user experience and expectations toward GUIs on the desktop versus in VR. It turns out that on the desktop we've evolved beyond skeuomorphism (physical metaphor), while in VR we still need more direct cues. However, extrapolating from the desktop learning curve, VR interfaces will soon be far more advanced.

Learn more in Episode 29 – Did Somebody Say Hand Interaction? Round-table with Alex Colgan.

Do you have any questions for us? Ideas for future guests or future discussion topics? Let us know on Twitter and down in the comments below!


  • Alex Colgan

    This interview was a blast 😀

  • unreal_ed

    So I think that what UI needs to do has changed considerably over the past 20-30 years. Back in the early days, like in the 90s, people didn't necessarily know how computers worked. Skeuomorphism makes sense in that context: you want people who don't interface with computers to be able to recognize the icon of a mailbox, so you make it look like a real mailbox.

    Nowadays, people know that if you put up a very flat icon that SYMBOLICALLY represents an envelope, you mean "this is the icon to go to your email". What you want from UI now is very symbolic, very minimal icons, so that they have clear silhouettes and, often, just one color. In fact, look at the icons for this comment system!

    I think the future of UI in VR will be more centered around interaction design. SoundStage shows a lot of what can be done with good interaction design. To load a sound to be played, you don't click a button and see a list of possible sounds. You spawn a cassette, slide it into a cassette player, and swap it with another one if you want to change the sound. This kind of interaction replaces a number of UI interactions, and I think we'll see more of it in the future.