The High-end VR Room of the Future Looks Like This

by Sarah Downey • October 15th, 2016

Today’s VR systems are both fantastic and restrictive: they blow you away, but it’s clear how far they have to go. The HTC Vive is arguably the best out there, but having to buy a souped-up laptop just to run it, paying full price for brief games that feel more like demos, and trailing a huge cable off your head and fumbling to mount trackers on your ceiling…it’s not ideal. But it’s still incredible enough to give a taste of where it’s headed.

Here’s my best guess of what the future high-end VR setup looks like. I’m an early-stage VC focused on virtual and augmented reality, so I pieced this together based on the forward-thinking pitches and demos I’ve been lucky enough to see through my work, plus a lifetime of burning through sci-fi and video games. Check out the bottom of this post for a list of VR inspiration.

Side note: AR will be much bigger than VR, in both diversity of use cases and market size (analysts predict $30B for VR versus $90B for AR by 2020), but I still believe most homes will have a dedicated VR space for total immersion.

Body movement

Let’s start from the ground up. Forget the room scale debate: the VR setup of the future moves with you. Maybe it uses an omnidirectional treadmill that adjusts speed and incline based on viewer inputs. To be truly immersive, it needs to be around 8 feet by 8 feet, given that the average sprint stride length for men—the longest possible stride variant—is 93 inches. That gives users more than enough space to walk, run, and even sprint while in VR. Or maybe a section of the floor itself serves as the treadmill, raised up as a platform that controls pitch, yaw, roll, and speed.
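As a quick back-of-the-envelope check on that footprint (the 93-inch figure is the sprint-stride estimate cited above; the 8-foot platform side is the proposal):

```python
# Rough sizing check for the treadmill platform described above.
SPRINT_STRIDE_IN = 93        # longest common stride variant, in inches
PLATFORM_SIDE_FT = 8         # proposed square platform side, in feet

stride_ft = SPRINT_STRIDE_IN / 12          # 7.75 ft per sprint stride
margin_ft = PLATFORM_SIDE_FT - stride_ft   # slack left over per stride

print(f"stride: {stride_ft} ft, margin: {margin_ft:.2f} ft")
# An 8 ft (96 in) side just clears the longest stride with a few inches to spare.
```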


Of course, not everyone wants to—or can—be on their feet for long periods, and plenty of immersive entertainment, like watching movies, is sedentary. VR experiences will support a seated and reclining mode when appropriate and shouldn’t be more complicated than pulling up a standard chair. Movement in these modes will likely employ similar mechanics to those we’re beginning to see today, like teleportation via gesture or gaze.

Tactile feedback

Next up is the bodysuit. To mimic the tactile feedback that you experience in real life, you’ll need sensors and haptics all over your body or at least in significant areas, like the face, hands, and feet. Focused, acute pulses simulate sharp points; broader, more distributed ones can simulate sensations like dipping into water. For those who want to push immersion further, optional climate controls mirror environmental conditions (within a safe temperature range).

The first hardware generation attempting to solve the body feedback problem will likely use full bodysuits with haptic responses aligned to the VR experience. The suit’s gloves will simulate gripping objects by restricting finger movement: wrap your hands around a hard plastic cup in VR, and your gloves will freeze at the point where you can’t squeeze any further. Squishier objects will have more give. It’s possible that putting on a full suit will be too much effort for most people, and they’ll find that hand and facial coverage is enough to give them the immersion level they want. We have more nerve receptors in our fingers than anywhere else in the body (besides our feet and lips), so covering tactile input in the hands may be enough to make the mind suspend its disbelief while in VR.
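A minimal sketch of that grip mechanic, assuming each virtual object exposes some kind of compression limit to the glove firmware (the names and numbers here are hypothetical, not any real device’s API):

```python
def allowed_closure(grip_input: float, compression_limit: float) -> float:
    """Clamp the user's finger closure at the object's give.

    grip_input: how far the user tries to close their fingers,
                0.0 (open hand) to 1.0 (fist).
    compression_limit: how far the object lets fingers sink in;
                near 0.0 for a rigid cup, higher for squishy things.
    """
    return min(grip_input, compression_limit)

hard_cup = 0.05     # hard plastic: the glove freezes almost immediately
stress_ball = 0.6   # squishier object: fingers sink much further

print(allowed_closure(0.8, hard_cup))     # 0.05
print(allowed_closure(0.8, stress_ball))  # 0.6
```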


Control options

Our future VR setups won’t need controllers. Steve Jobs once said about using the human hand for interaction that “God gave us ten styluses…let’s not invent another.” Imagine the same touch and motion-based actions we’re used to on mobile phones, only happening in the air with our hands while we’re in VR. Need to hold something as part of a VR experience? Your gloves mimic the width and feel of a gun in a first-person shooter, the handle of a scalpel in a surgery simulator, or the stitching on a football…all while you’re empty-handed. Voice UI can supplement gestures with more detailed natural language commands.


Weight is tougher to simulate in VR. The suit could stiffen and slow a user’s movement in proportion to the relative weight of an object: stooping to pick up a piece of furniture would force a slow stand-up, versus an unaffected one for a feather. Haptic feedback, movement speed in VR, and other techniques could add to the weight effect. For an in-depth discussion of the weight problem in VR today, see “Simulating Weight in VR.”
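One way to express that stiffening as a sketch: scale the user’s allowed movement speed by the held object’s mass relative to some comfortable maximum. Everything here (the cap, the curve, the floor on speed) is an illustrative assumption, not a known suit design:

```python
def movement_scale(object_kg: float, max_comfortable_kg: float = 50.0) -> float:
    """Return a 0..1 multiplier for the user's stand-up/movement speed.

    A feather (~0 kg) leaves movement unaffected (scale ~1.0); furniture
    near the cap forces a slow, stiff stand-up (scale near 0.1). The suit
    never fully freezes the user, for safety.
    """
    load = min(object_kg, max_comfortable_kg) / max_comfortable_kg
    return 1.0 - 0.9 * load

print(movement_scale(0.01))  # feather: scale ~1.0
print(movement_scale(50.0))  # heavy furniture: scale ~0.1
```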

Eventually we’ll have the option to avoid climbing into suits at all. Neural signals can make users see and feel everything as real. Think plugging into the Matrix, but with the awareness and intent to do so. After all, we experience objective reality today filtered through our own senses; no two people see the same thing in exactly the same way, and we interpret our incoming sensory inputs as real. But despite the option to “jack in” in the distant future, plenty of people will still opt to use the treadmill and haptics combo for the exercise benefits.


Visuals

The visual input is the most important piece of the VR setup. Right now we’re stuck thinking in terms of head-mounted displays (HMDs), like the Vive or the Oculus. But display lag and a narrow field of view make people nauseated, and exercise in a headset is pretty unpleasant—picture a heavy piece of hardware bouncing on your face while you’re sweating into its lenses.


Future solutions will get rid of clunky wired headsets and move onto glasses that can project a high-definition image onto the eye, à la Magic Leap, and eventually contact lenses that contain tiny screens. We’d wear these contacts all the time, switching between AR mode (high transparency so you can see the world underneath digital overlays) and VR mode (low transparency so you can achieve full immersion). VR mode will likely happen at home or in private spaces, like offices for virtual meetings or the couch for gaming. AR mode will be everywhere else: calling up a heads-up display of Google Maps on the street, stopping to catch a Pokémon in a field, or scanning the person in the coffee meeting across from you to cross-reference their LinkedIn profile. Eye tracking will enable more realistic interactions with both NPCs and human avatars, along with detailed analytics, heatmapping, and privacy concerns.

Let’s move to the top of the head. Electroencephalograms (EEGs) can read user brainwaves, converting thought to action. Users might look at an object in VR and think about how they’d like to interact with it. Those interactions can be mapped to brainwaves and translated into action with almost no latency. Whether it’s an EEG housed in the full-body suit or a free-standing one atop the head, these devices will empower users to be hands-free in VR, especially when brain responses combine with eye tracking. VR developers are likely to support brain-computer interfaces in their experiences to add a wow factor: imagine actually having the Force in a game like Mass Effect.
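The gaze-plus-brainwave combination described above could look something like this in game logic: the gazed-at object narrows the context, and the decoded intent selects the action. This is a hypothetical sketch—no real BCI API or classifier is assumed, and all the names are made up:

```python
# Hypothetical mapping from (gaze target, decoded EEG intent) to an action.
ACTIONS = {
    ("door", "open"): "swing_door_open",
    ("door", "push"): "slam_door",
    ("crate", "lift"): "telekinetic_lift",
}

def resolve(gaze_target, decoded_intent):
    """Combine what the user is looking at with what they're thinking.

    Returns the action name, or None when the pair has no mapping
    (in which case the game simply does nothing).
    """
    return ACTIONS.get((gaze_target, decoded_intent))

print(resolve("crate", "lift"))   # telekinetic_lift
print(resolve("crate", "open"))   # None
```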

Sensory additions

Adding scent is another way to boost VR realism. Devices either on the user’s bodysuit or placed around the room will release smells tied to the VR experience. Companies like International Flavors and Fragrances have already mapped synthetic base smells that, when combined in different ratios, can create almost any scent. Startups will bring platforms to market that will let content creators add a scent layer to their work that a hardware peripheral will release at key moments. Running through Hyrule field in Legend of Zelda VR? You’d smell the grass. Users who want to add more realism could add fans to their VR lair to blow air to simulate wind and falling.

Audio is another important sensory input to get right in VR. Whether the sounds come from headphones or speakers around the room, developers will build audio into their VR experiences so that sound changes with the user’s movement and proximity to its source.
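The proximity piece of that reduces, at its simplest, to distance-based attenuation. Here’s a bare inverse-square sketch; real spatial-audio engines layer head-related transfer functions, occlusion, and reverb on top of something like this:

```python
def gain(distance_m: float, ref_m: float = 1.0) -> float:
    """Inverse-square volume falloff for a point sound source.

    Full volume at or inside the reference distance; beyond it, power
    drops with the square of distance.
    """
    if distance_m <= ref_m:
        return 1.0
    return (ref_m / distance_m) ** 2

print(gain(0.5))  # 1.0: right next to the source
print(gain(2.0))  # 0.25: a quarter of the power two meters out
```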


Devices running VR apps will be more powerful, smaller, and cheaper than today’s options, communicating wirelessly with the viewing apparatus. Maybe it’s a dedicated VR computer to start, but more likely it becomes mobile hardware, given Moore’s Law and the current trend toward everything mobile. Being able to pick up your VR hardware and take it with you is a big benefit.

In the VR room, whether mounted in the periphery or on a user’s bodysuit, you’d find super high-resolution sensors to track a user’s body in three dimensions and scanners to detect and render body positions and facial movements. Imagine a board meeting in VR where the people around the table can see each other’s real-life facial reactions and head leans.

Today, we’re seeing the very beginning of what VR technology will look like. I see a lot of companies and founders trying to take experiences we have today and port them over to VR without much change—like putting a standard browser window into a 3D environment, using the same navigation—but the innovative ones are thinking in entirely new ways about what’s possible. Now’s an amazing time to be alive, and I can’t wait until we’re all hanging out in the metaverse.

Here’s a list by technology type that includes both companies working on these problems today (in bold), plus some sci-fi inspiration for them (in italics).

This isn’t meant to be an exhaustive list, but if I missed something major, please tell me and I’ll add it. Also, please reach out if you’re working on anything cool in this space: sarah(at)accomplice(dot)co.

Hand and finger tracking, gesture interfaces, and grip simulation:

AR and VR viewers:

Omnidirectional treadmills:

Haptic feedback bodysuits:

Brain-computer interfaces:

Neural plugins:

  • The Matrix (film)
  • Sword Art Online (TV show)
  • Neuromancer (novel)
  • Total Recall (film)
  • Avatar (film)

3D tracking, capture, and/or rendering:

Eye tracking:

VR audio:

Scent creation:

Sarah Downey is a principal at Accomplice, which is a sponsor of Upload.


  • Folo

    Wow! Thanks for all the information and links!

    • Sarah A. Downey

      👍 Glad you liked it.

      • Sarah, thanks for great article:) Any chance to replace broken link for Teslasuit?

        • Sarah A. Downey

          Looks like the site’s down on their side. I can’t find one that’s live right now.

  • VR Geek

    Fantastic piece!!! One area not mentioned that also has lots of nerve receptors are our reproductive parts. In fact, it is this area that already has early haptics in the VR Porn space. The future of sex is unclear, but it is safe to say more of us will be getting off in the Metaverse year over year.

    • Sarah A. Downey

      Oh, for sure. Teledildonics should be its own article. Thanks for the post idea.

  • garyhayes

    One of the most concise, to the point – uncluttered with ‘tech comparison’, articles on where we are heading in #VR…

    • Sarah A. Downey

      Thank you, @garyhayes:disqus!

  • Vassgard

    Great article! But not even mentioning the IMAX VR with Starbreeze & Acers 210 degree FOV “5K” StarVR headset seems weird. Location based VR initiatives like The Void are becoming more and more common. Also tobii is pretty well known company for eye tracking! 🙂

    • Sarah A. Downey

      Thanks, @vassgard:disqus–I hoped people would fill me in on the big ones I missed. Appreciate it.

  • DonMac

    A very well constructed article, with some very useful resources, but I would observe that the author makes no attempt to pin her ‘best guesses’ to any time frame at all (next year, 5 year, 10 year…)

    I would also recommend caution, many of the more speculative predictions here I heard 25 years ago, during the first boom and bust of VR where the media industry drove the expectations of customers past the point that could be technologically delivered.

    Even the leading experts in the field such as Michael Abrash are showing a reluctance to make predictions past five years, especially with technologies that stray from a predictable graph of progress.

    It is good to dream, and fantastic to see so many companies working on innovative solutions but let’s keep it grounded in the possible.

    • Sarah A. Downey

      Thanks, and you’re right @disqus_4Xu35f5wfe:disqus that this was more a futurology piece than one on what’s definitely coming in a shorter time frame. I’d put some of the more advanced developments, like AR contacts, out 10+ years. I’ll write more on the concrete things happening now and within 5 years, so stay tuned.

      • Aditya

        Hey Sarah! Brilliant Piece. Any views on how the drone based photography could get refined? Say by 2020 olympics? I am looking for interface examples or more importantly how drones could be controlled, used to click pictures of events and at the same time update social media..A little complicated but would love you opionion or examples.

  • Swirling Blade

    What an awesome article! It’s the future of VR that makes me excited about it. I’m glad to see such a positive article, especially after reading a crappy and poorly thought out negative one from somebody else.

  • Jack H

    Also vestibular stimulation for simulation of acceleration.

  • Michael Carter

    Ready Player One meets Rainbows End