Quest Pro and Quest 2 can do some pretty cool things, but the software continues to be maddening.
Here are some examples of the features available on Meta’s standalone headsets:
- Quests allow the use of multiple Web browser windows, traditional Android APKs, and even several PC monitors in VR.
- The headsets support basic multitasking like the PCs that Meta’s platform promises to replace one day.
- The Touch and Touch Pro controllers can be used on a flat surface to sketch out ideas and post them on a planning board, much like traditional paper sticky notes.
- You can connect a keyboard and mouse, type without taking the headset off, and use the cursor to manage browser windows.
- There’s a new setting to enable background audio in VR apps so you can listen to your tunes while exploring a virtual world.
- You can see a passthrough view of your desk if you want to look down and grab a cup of coffee.
- You can even bring your whole room into VR for next-generation mixed reality that believably incorporates your walls and furniture into dynamic experiences, ones that feel closer to augmented reality than virtual reality.
All of these powerhouse features are technically available on Quest Pro and Quest 2 today. They’re awesome ideas and major steps forward for VR as the next generation of personal computing, pushing far past things like floating keyboards you point at with lasers or the passthrough flashlight in Microsoft’s seemingly abandoned Windows Mixed Reality platform. But to say any of Meta’s most forward-looking features are broadly usable would be laughable. At present, Quest’s best ideas are buried in menus or experimental settings, or siloed within apps that don’t share their functionality elsewhere. The hardware is cool, but using a Quest Pro as of this writing is an exercise in losing time and patience to confusing software.
For instance, why do I need to install and launch a meetings app to access my PC? And why does this take me to a virtual office environment separate from the one I’m in when using the built-in browser?
Take this menu as another example:
The simple question of “how do I see my keyboard in VR” is answered in very different ways by different apps. There’s also a range of passthrough options supported in software like Immersed VR and Horizon Workrooms. These apps let you see through your headset’s cameras: Workrooms lets you look down and see your desk, while Immersed lets you place a shape in your environment wherever you want to see the physical world.
Meta also supports a handful of tracked keyboards across its system, including the Logitech K830 and Logitech MX Keys. The K830 includes a crummy trackpad that, buggy as it is, remains a built-in nicety: you don’t have to reach around blindfolded for another piece of equipment on your desk just to move a cursor from one virtual Web browser window to the next. The MX Keys, meanwhile, can be connected to three separate devices, which is convenient for setups where you’re going back and forth between different computers. I can, for example, press the “1” Bluetooth key to tell MX Keys to connect directly to my headset for use in “supported apps” like the home environment with its Web browser, then press the “2” Bluetooth key to connect the keyboard directly to my PC for “supported” remote desktop apps.
These options make sense — there’s a lot of flexibility here for developers. In practice, though, this requires users to repeatedly navigate Quest’s labyrinthine menus, maintain an encyclopedic knowledge of which apps support the tracked keyboard, connect the keyboard to the right device, and use the right Guardian mode for the task in question. There are also three entirely separate safety systems: Guardian, Space Sense, and Room Setup. Worse, Guardian has four separate “modes”: Stationary, Roomscale, Desk, and Couch.
There are also three “personal” spaces on Quest — one each inside Workrooms, Horizon Worlds, and the core Horizon Home built into Quest. Each space carries different features: Worlds offers comfort settings, Workrooms requires that I sit at my desk, and Home offers multitasking.
Say I’m using Horizon Workrooms on Quest Pro and lean backward in my chair while sitting at my desk. I’m often jarringly pulled out of my focused VR workspace and into a view of my surrounding room as Quest tries to figure out whether I’m at my desk or want to switch to a stationary Guardian. Do I lean forward to fix this? Do I move my chair? Do I redo my desk setup while sitting in just the right place? And why does Quest Pro keep asking me to check my fit as I work through this?
My colleague Harry Baker wrote a nice editorial earlier this year about Apple approaching this space soon and its likely advantages over Meta. His take focused largely on Apple’s potential to advance the market; I wanted to break down why it’s so frustrating to see Meta test so many ideas without proper explanation. The Quest platform grants a power user immense flexibility, yet at the same time people can be driven nuts by the staggered, near-constant rollout of new or slightly changed features, which arrive seemingly at random with little explanation of what changed and when.
It’s likely that the “Room Setup” feature on Quest will eventually subsume the other safety systems — and that’ll go a long way toward fixing some of these problems. It’s my hope, though, that Meta can take the great ideas on display in places like Horizon Workrooms and merge them with the multitasking work in Horizon Home. And Meta should do a better job rolling out new features with clear explanations the first time they’re seen inside the headset, instead of burying them in update notes on its website.