Hands-On at CES With Intel’s Project Alloy Standalone VR Headset

by Ian Hamilton • January 5th, 2017

Project Alloy from Intel is a prototype VR headset with important new features from one of the world’s most influential tech companies, and we’ve just tried the first hands-on demo of the hardware at CES.

Alloy sits in the same standalone category as the Santa Cruz prototype from Facebook’s Oculus, meaning the hardware you wear on your head includes not just the display, but the rendering and positional tracking technologies that are fundamental parts of making VR work. Unlike the Oculus Rift and HTC Vive, no outside hardware, sensors, or cameras are needed.

Developer kits for Alloy are already in the hands of Intel’s partners and the company expects it to be turned into a consumer product by the end of the year. It is a bit of a heated race for Intel, because Microsoft already announced partners working with the tracking technology it developed for HoloLens. This critical technology is a prize for Microsoft, developed over a number of years, and both Facebook and Google (along with many others) are racing to match it. Heading into an era of mixed reality, if Intel is to retain its position as a supplier of fundamental technology for a wide range of manufacturers, it needs Project Alloy and its tracking technology to be a solid platform upon which partners can build.

So how did it work? I am one of the few people in the world to have tried both Facebook’s prototype and Intel’s, so I have some perspective others don’t. That said, my time in each headset was extremely limited, the prototype is in ongoing development, and my impressions are totally subjective. So keep that in mind as you read on.

Intel Merged Reality

I can’t make too many conclusive statements about Intel’s technology, except to say that it won’t deliver an experience that feels anything like the one depicted in the “merged reality” video below anytime soon.

Instead, the Project Alloy demo I experienced “drifted” considerably. If it had been me wearing the Project Alloy prototype in the video above, I would’ve walked into a door.

Intel said it has the technology to scan a room while the headset is on, but in my demo it had been scanned beforehand. This scanning process should allow software to dynamically mold itself to the physical geometry of the room.

For my demo, a physical table in the center of the room became some kind of a glowing energy portal in the world I saw inside the headset, while the furniture around the perimeter became crates. The walls were gone and the environment turned into a platform in the desert, similar to the VR game Hover Junkers, and I was free to walk around the platform.

I don’t recall a physical object in the middle of the room in the Santa Cruz demo, so there is no direct point of comparison, but in the Project Alloy demo this object’s position drifted considerably as I moved around the room. In other words, the physical table and its virtual counterpart gradually slid out of alignment. The result: I bumped into the table. I could reach down and touch it with my hands, but the spot where my sense of touch told me the object was turned out to be about a foot off from where my eyes placed it. Intel suggested the drift might have been caused by the number of people in the room.

The controller I held in my hand featured limited tracking, on par with the Daydream controller’s three degrees of freedom. I could tilt it left, right, up and down, but if I moved it forward or backward that movement wasn’t reflected in the virtual world. Unlike Daydream, however, my head could move back and forth. In the game I tried, this meant that when airborne attackers arrived, I had to point the controller at the invaders and pull the trigger to shoot them.

Additional ammo dropped in when I ran out, and I should have been able to reach out and grab it, except the controller can’t do that. Instead, I had to walk closer to the ammo to pick it up, which wasn’t very intuitive. Intel said it is offering a separate demo at CES with fully tracked controllers, but we haven’t had the chance to try it yet.

The weight of the headset was well balanced, with the battery in back and the processing up front, and I felt no pressure on the front of my face. I could feel heat coming off the headset by hovering my hand about half an inch above its front surface. After a very short session in VR, my face was sweating even though I wasn’t very active; Intel attributed this to the placeholder facial interface, which didn’t let air circulate around my face.


Facebook’s Santa Cruz versus Intel’s Alloy

Though my demos were separated by several months, I believe the Alloy prototype I tried was much heavier than Santa Cruz. The Oculus demo, however, seemed to have a sparser environment. This makes sense, as Alloy is a close relative of PCs, being powered by Intel, while the Santa Cruz prototype is a mashup of the mobile Gear VR and the tethered Oculus Rift.

Santa Cruz had only one small noticeable tracking hiccup during my demo, though spotting drift was difficult without the clear point of reference the Intel demo provided. Overall, I found myself pretty timid in Alloy for fear of bumping into something, whereas I walked confidently from one side of the room to the other in Santa Cruz because the room was empty.

Intel says the technology will get lighter and improve in pretty much every aspect on its path to becoming a consumer product, with “hours” of battery life being the target. We’ll track down more demos to see the other pieces of Intel’s VR-related technology, and we will follow up with a longer video as soon as possible, but I will frankly be (pleasantly) surprised if it all comes together in a consumer product this year.


What's your reaction?
  • Kevin Walker

    The tech sounds good, and I can see many uses for it. If it can track objects in real time you could have a drink without having to take off your HMD, or see the cat you’re about to step on. But as far as using it in games goes, I can’t see the point. I put on my Rift to escape my living room. Just like with AR, why would I want to play every game in one location? What’s the next level going to be? Oh, it’s my living room again! Clear some space and play in any location you want.

    • Steebie

      This is exactly what I tell people who read about Tim Cook in the news. AR and VR occupy two separate spaces. AR would be great for, say, a technician going through an engine bay with all of the hoses, wires and parts labelled with overlaid instructions, or a parts clerk looking for a piece of equipment in a vast warehouse, but VR can take you places other than where you are physically located.

    • DougP

      Re: “you could have a drink without having to take off your HMD, or see the cat you’re about to step on.”
      People don’t need to wait for full room/environment scanning and tracking. The Vive’s camera is great for the situations you mentioned.

      • Andrew Carr

        Honestly, I think people forget or simply never knew the Vive can do that. The first time I tried out my friend’s Vive, I said it would be really cool if you could still see the room to get a better sense of your surroundings, and he agreed. I ended up borrowing his Vive for a week, and during the setup I saw an option to show the room instead of just the grid when you get too close to a wall, and another option to pull up a tiny room view when opening the Vive menu. After telling my friend about it, he said he remembered that from the setup but found it too distracting while gaming, so he turned it off and forgot about it. I ended up agreeing that the full room view was a bit distracting, but the tiny room view was perfect for grabbing something without having to take off the headset.

  • The work that the Intel engineers are doing is incredibly exciting but I sometimes wonder if they weep when they see a marketing video like this.
    I like that the video grabs your attention, but it demonstrates a common point of confusion regarding mixed reality. Whereas in a game (which might be in VR) you can change the narrative, that’s much harder to do when you’re augmenting reality, because the reality part keeps on doing what it was doing.
    Anything you do will cause you to increasingly diverge from reality. This is why you can’t, as people once hoped, drive your virtual car in a real F1 race in real time and get sensible results (e.g. what happens if you crash into the winner on the first corner?).
    Arthur C. Clarke once wrote a short story about time travel in which someone went back to look at dinosaurs and trod on a butterfly. When they returned there was a different president and other awful things had happened. Same notion, really.

    • No Spam

      Great reference, but FYI the story is “A Sound of Thunder” by Ray Bradbury (not Arthur C. Clarke)

      A highly recommended and short read, and I’ve seen it online in PDF form via school curricula. Worth a Google.

      • You’re absolutely right. Thanks for that.
        Going back to the video, I think Ian is right to start probing the hype surrounding AR and VR, because it isn’t helping. Honest manufacturers find it very hard to compete with smoke and mirrors.

  • GodMk2

    I have an issue with AR using room tracking. Say you are playing an FPS game like Onward and are walking across the desert. How do you deal with your dining table? Why is a crate following you about? If you’re playing a wave shooter with less movement, then yes, you can make real objects become cover items, but look at how many negative comments wave shooters are already getting. It seems this tech to track fixed real-world objects could be its biggest negative point when it comes to gaming. For AR-type applications, a clear lens with objects merged over the top of reality seems a better solution. Then go play out in the street and use a dustbin / neighbour’s car / tree as cover. But bear in mind a virtual Barrett .50 cal (as well as a real one) can put a bullet through a concrete block wall 😉 VR needs a clear holodeck/room-scale type place to play, or a system where you can walk on the spot using Fitbits or the new Vive trackers on your feet for kicking and strafing.

  • Konchu

    Glad to see the gun tracking demo here doesn’t look bad; interested to see how this plays out.

  • Thanks for this review, it is super interesting, since you have done this comparison. And from this review, Intel comes out looking pretty bad.
    All these tracking glitches are bad… and since Intel and Microsoft partner a lot, they could have used Microsoft’s tracking technology, which works like a charm in HoloLens.