Intel’s ‘Merged Reality’ Demo Brings Actual Hands Into VR on Oculus Rift

by Joe Durbin • January 6th, 2017

The bleachers at Intel’s sprawling CES booth at the Las Vegas Convention Center were packed to the gills with eager technophiles waiting to see something incredible on Thursday afternoon. As it turns out, they were rewarded for their loyal presence when Aneet Chopra from Intel took the stage and began to explain the concept of “Merged Reality.”

To begin his remarks, Chopra played the following video:

That video is four months old and primarily serves to position Project Alloy: Intel’s first attempt to design its own self-contained virtual reality headset. During this presentation, however, Chopra would not be focusing on Alloy at all. In fact, his main demonstration would use an Oculus Rift retrofitted with Intel’s RealSense camera.

Chopra invited an eager Intel colleague onto the stage and asked him to don the jerry-rigged headset. The man obliged, and soon the large screen behind him was filled with balloons. Now, I like balloons just as much as the next guy, but what appeared on screen next was perhaps even more impressive.

Chopra asked his colleague to show his hands, and soon they appeared on screen among the balloons. These were not an approximate version of the man’s mitts created by some sort of VR controller or projected onto a digitally created avatar. No, these were his actual, real-world hands, complete with wedding ring. That in and of itself is enough to raise an eyebrow or two, but Chopra and co. then took it a step further.

Chopra’s assistant began moving his hands toward the virtual balloons he was seeing in the headset, and he was able to bat them away with what appeared to be a high degree of tracking precision. The plot thickened again when the man began manipulating the balloons with his fingers as well.

Not only were his real hands inside VR with him, but they were being tracked down to the individual digits.
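
Intel didn’t walk through the implementation on stage, but the visible effect (live hands composited over a rendered scene) is typically achieved by thresholding the depth image from a camera like the RealSense to isolate anything within arm’s reach, then pasting those color pixels over the rendered frame. Below is a minimal sketch of that idea in Python with NumPy; the frames are synthetic stand-ins, and the 60 cm cutoff is an illustrative guess rather than Intel’s figure.

```python
import numpy as np

def composite_hands(rendered, color, depth_mm, max_depth_mm=600):
    """Overlay camera pixels nearer than max_depth_mm onto a rendered VR frame.

    rendered -- (H, W, 3) uint8 frame from the VR renderer
    color    -- (H, W, 3) uint8 camera image, aligned to the depth image
    depth_mm -- (H, W) uint16 depth in millimetres (0 = no reading)
    """
    # Assume anything closer than ~60 cm is the user's hands and forearms.
    hand_mask = (depth_mm > 0) & (depth_mm < max_depth_mm)
    out = rendered.copy()
    out[hand_mask] = color[hand_mask]  # paste real pixels into the virtual view
    return out

# Synthetic stand-ins for a single 640x480 frame.
H, W = 480, 640
rendered = np.full((H, W, 3), 40, dtype=np.uint8)             # dark virtual scene
color = np.random.randint(0, 256, (H, W, 3), dtype=np.uint8)  # fake camera feed
depth = np.full((H, W), 2000, dtype=np.uint16)                # far background
depth[200:300, 250:400] = 450                                 # a "hand" at 45 cm

frame = composite_hands(rendered, color, depth)
print(frame.shape, frame.dtype)  # (480, 640, 3) uint8
```

A mask like this only gets you visible hands; the per-finger tracking shown on stage additionally requires fitting a hand-skeleton model to the depth data, the kind of middleware Intel has shipped with the RealSense SDK.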

This avatar- and controller-free hand tracking experience is vital to what Chopra called the “pillars of Merged Reality.”

These are: an untethered (wireless) headset, freedom of movement, integrated inside-out tracking, and natural manipulation. That last pillar is what Chopra and his cohort were demonstrating on stage. According to Chopra, Alloy will be the first headset to incorporate this type of technology natively and begin ushering in the age of Merged Reality.

We came away with a mixed reaction from our own hands-on demo of Project Alloy using Intel’s RealSense camera at CES, and it remains to be seen how this technology will be adopted by the industry as a whole.


  • Konchu

    I do like this… but my concern is tracking what isn’t seen, i.e. hand tracking being limited to where you are looking. That could limit some interactions: waving above your head, crossing your fingers behind your back, etc. It would also limit its ability to track game controllers; one of the cool things in VR is that I can aim a gun blindly or fire from the hip. But it’s good to see they at least have an option for tracking hands.

    • MR Not Dux OSAR

      Good points on the social interaction angle. I would never have thought of that, as I mostly play flight sims or single-player RPGs.
      That being said, I personally love the idea of being able to access radar buttons or switches in a VR flight sim, or to use a digital keyboard. This basically gives us touchscreens in a VR setting.

      Sounds amazing.

    • Allan

      Also, in the videos there does appear to be more than a little latency. Sure hope they can resolve that.

      • DanCarmon

        I think the latency happened because the video feed was being projected to the screen. I saw it happen numerous times at other booths at CES, where what I saw in the HMD was shown with a delay on the outside monitor.

    • Graham J ⭐️

      Agreed. Hand tracking may (or may not) prove to be the holy grail of immersive interaction, but it can’t be limited to the headset’s point of view. Gloves and controllers will do a better job of this.

      • RedLeader

        Actually, what would be best is integrating this tech on big sensors like the Vive Lighthouse ones so that the sensors themselves could reliably track the hands. No controllers, no gloves needed. We’re still a few years off from that, but in 2014 Oculus acquired Nimble VR and you can bet that they’re definitely working on that.

        • Graham J ⭐️

          For set-up rooms, definitely, though Intel seems to be focused on fully mobile. I’m not sure why; the tradeoffs currently required to go fully mobile aren’t worth it IMO.

  • Cool, but I don’t get what the advantage of seeing my true hands is… everything surrounding them is fake, so having my true hands can be presence-breaking (it makes me notice that the surrounding stuff is not real).

    • RoJoyInc

      Seeing hands is presence-breaking? Versus no hands, just floating controllers = real presence?

      • Maybe I explained myself badly: hands are great, of course, but seeing my hands as captured by the camera on top of a completely CG environment can break presence. It’s like the Roger Rabbit movie… part cartoon and part real footage.

    • RedLeader

      You have absolutely no idea what you’re talking about. Seeing your hands in VR = more immersion 100% of the time. I speak from experience with 2 Rift headsets, a Leap Motion, and an Oculus Touch kit. Seeing and using hands = more natural interaction.

      • Man, I have a clear idea of what I’m talking about. I’m not talking about using hands in VR; that surely augments presence. I’m talking about showing my actual hands, taken from the camera’s frame buffer. It’s like showing a video stream inside a cartoon environment… like the Roger Rabbit movie…

        • RedLeader

          I think this explanation makes much more sense and I can kind of see your point now. We’ll have to wait and see.

  • Graham J ⭐️

    This is more or less what they showed of Alloy before: depth mapping, isolation of a plane, and interaction of that plane with the virtual world. A Kinect strapped to any headset could do this, and Leap Motion already does it much better (albeit without full-colour hands, though adding a camera is easy enough).

    What everyone is looking to RealSense for is accurate inside-out tracking, and they still haven’t nailed that.
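
The “interaction with the virtual world” step Graham J describes can be fairly simple in practice: back-project the masked depth pixels into 3D points and test them against virtual colliders such as the demo’s balloons. A rough sketch of that test follows; the camera intrinsics and balloon radius are made-up illustrative values, not anything Intel has published.

```python
import numpy as np

def hand_points_3d(depth_mm, hand_mask, fx=600.0, fy=600.0, cx=320.0, cy=240.0):
    """Back-project masked depth pixels into camera-space 3D points (metres)."""
    v, u = np.nonzero(hand_mask)          # pixel coordinates of hand pixels
    z = depth_mm[v, u] / 1000.0           # depth in metres
    x = (u - cx) * z / fx                 # pinhole camera model
    y = (v - cy) * z / fy
    return np.column_stack([x, y, z])

def touches_balloon(points, center, radius=0.15):
    """True if any hand point falls inside a balloon's bounding sphere."""
    if len(points) == 0:
        return False
    return bool(np.min(np.linalg.norm(points - center, axis=1)) < radius)

# One fake frame: a patch of "hand" pixels at 45 cm, dead centre.
depth = np.full((480, 640), 2000, dtype=np.uint16)
mask = np.zeros((480, 640), dtype=bool)
depth[220:260, 300:360] = 450
mask[220:260, 300:360] = True

pts = hand_points_3d(depth, mask)
print(touches_balloon(pts, center=np.array([0.0, 0.0, 0.45])))  # True: bat it away
```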

  • Eric Nevala

    How is this better than Leap Motion? Have they solved the tracking frustum problem?
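
For context on the “tracking frustum problem” Konchu and Eric Nevala both raise: a head-mounted depth camera can only track hands that fall inside its field of view and usable depth range, so a hand raised overhead or held behind the back simply vanishes. A hypothetical containment check makes the limitation concrete; the FOV and range numbers below are illustrative guesses, not RealSense specs.

```python
import numpy as np

def in_camera_frustum(point_cam, h_fov_deg=70.0, v_fov_deg=55.0,
                      near=0.2, far=1.2):
    """Return True if a 3D point (metres, camera space, +Z forward)
    lies inside the head-mounted camera's tracking frustum."""
    x, y, z = point_cam
    if not (near <= z <= far):
        return False                                  # outside usable depth range
    h_limit = z * np.tan(np.radians(h_fov_deg / 2))   # half-width at this depth
    v_limit = z * np.tan(np.radians(v_fov_deg / 2))   # half-height at this depth
    return abs(x) <= h_limit and abs(y) <= v_limit

print(in_camera_frustum((0.1, 0.0, 0.5)))  # hand in front of the visor: True
print(in_camera_frustum((0.0, 0.6, 0.3)))  # hand raised above the head: False
```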

  • Little RED

    Wow. Keep going….

  • care package

    Hand tracking is great for certain applications, but until users want to ditch gaming that requires triggers and button presses, hand tracking will be of no use for those applications. Hand gestures that mimic button presses aren’t practical either.

  • Jean-Daniel Taupiac

    This is Augmented Virtuality… absolutely NOT Merged Reality (aka Mixed Reality)!
    You just bring real elements (hands, in this example) into the virtual world, but you don’t have a mixed environment where all the real and virtual objects interact in real time!
    How could Intel make this mistake?