The “Year of Virtual Reality” is finally upon us, and with CES just days away, the buzz surrounding the industry has reached an all-time high. We are closer than ever to seeing consumer VR become a realized dream, with the Oculus Rift CV1 release “months, not years” away and many pundits suggesting that date will be some time in May. But lest we put the cart before the horse, the question remains: is virtual reality ready for consumers? In this article I outline the problems that still face virtual reality, as well as those that have already been solved. It is intended as an overview for those who are new to the VR space and are wondering what hurdles the technology must clear before it becomes viable for consumers.
The objective of virtual reality is to make the user feel a sense of presence in a virtual environment. This involves tricking as many of the user’s senses as possible, which is clearly an extremely challenging problem. We cannot yet wire into the user’s brain à la The Matrix, so we have to settle for some limitations. The most essential sense to trick is vision, and the quintessential device for doing so is the head-mounted display (HMD). An HMD is a hardware rig that includes sensors to track the position and orientation of the user’s head, and one or two screens that render a separate image for each eye to provide a sense of depth. The images are of 3D virtual environments, rendered from slightly different perspectives to simulate the distance between the eyes. This is what gives a sense of depth that cannot be perceived from a flat TV or computer screen. The sensors in the HMD track the orientation and position of the user’s head so the user can physically move their head to look around the virtual environment.
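The stereo rendering described above can be sketched in a few lines. This is a simplified illustration, not any vendor’s actual implementation; the interpupillary distance value and the `eye_positions` helper are assumptions for the example. The scene is rendered twice, from two camera positions separated by the distance between the eyes, and the disparity between the two images is what produces depth.

```python
# Sketch: one camera per eye, offset along the head's "right" axis.
# IPD value and function names are hypothetical, for illustration only.

IPD = 0.064  # assumed average interpupillary distance, in meters

def eye_positions(head_position, right_vector):
    """Return the left- and right-eye camera positions for a head pose.

    head_position: [x, y, z] of the head center.
    right_vector: unit vector pointing to the head's right.
    """
    half = IPD / 2.0
    left = [h - half * r for h, r in zip(head_position, right_vector)]
    right = [h + half * r for h, r in zip(head_position, right_vector)]
    return left, right

# Head at eye height, facing -Z, so the head's right vector is +X:
left_eye, right_eye = eye_positions([0.0, 1.7, 0.0], [1.0, 0.0, 0.0])
print(left_eye)   # [-0.032, 1.7, 0.0]
print(right_eye)  # [0.032, 1.7, 0.0]
```

A real renderer would then build a full view matrix (position plus orientation) per eye and render the scene once with each, which is why VR roughly doubles the rendering cost.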
The features of an HMD sound simple enough, but performing each function well is one of the great challenges of VR. The brain has an uncanny ability to detect discrepancies between the way we are moving and what we see. If the two are not consistent, our sense of presence in the virtual environment can be broken and we can experience discomfort in some form. One of the more severe consequences that can occur is motion sickness.
If you have been following the progress of consumer VR hardware at all, you are most likely aware that HMDs can cause motion sickness. Motion sickness, also known as simulator sickness, has been one of the worst problems plaguing consumer VR. It is worth mentioning that some people get simulator sickness even when they play computer games on a normal monitor, but consumer VR in its current state is much more likely to make the user sick. The causes of simulator sickness are high latency, lack of or poor positional tracking, and movement in the virtual world that is inconsistent with what the user’s body is doing. Fixing each of these problems will eliminate simulator sickness for the majority of users. In the next three sections I explain each of these problems and what is being done to solve them.
When you wear an HMD and you move or rotate your head, there is inevitably a delay between the time that you move your head and the time that the image on the HMD screen gets updated to reflect the movement. This delay is known as latency. The HMD sensors first detect the movement and must send it to the machine running the VR software, which must then re-render the image of the virtual world twice (once for each eye). If this process takes too long, there is a good chance that you will get simulator sickness, or at the very least your sense of presence will be reduced. High latency is often caused by the framerate of the VR application being too low because the graphics hardware in the computer is not powerful enough. As stated above, a VR application must render a separate image for each eye, which is expensive in terms of graphics processing power. A powerful computer is therefore required to run many VR applications.
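The link between framerate and latency comes down to simple arithmetic: at a given display refresh rate, each stereo frame must finish rendering within a fixed time budget, or the display shows a stale image and perceived latency spikes. A quick sketch with illustrative refresh rates (the function name is my own, not from any VR SDK):

```python
# Sketch: the per-frame time budget implied by a display refresh rate.
# Miss the budget and the previous frame is shown again, adding latency.

def frame_budget_ms(refresh_hz):
    """Milliseconds available to render one stereo frame."""
    return 1000.0 / refresh_hz

for hz in (60, 75, 90):
    print(f"{hz} Hz -> {frame_budget_ms(hz):.1f} ms per stereo frame")
```

At 90 Hz that budget is only about 11.1 ms, and both eye images plus all sensor processing must fit inside it, which is why underpowered graphics hardware so quickly translates into visible lag.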
Even with expensive machines, VR hardware and software engineers must go through painstaking efforts to make sure latency is as low as possible. Techniques such as “time warp” have been devised to reduce perceived latency. The Samsung GearVR also gives VR applications priority on the Android OS so that latency can be reduced. If you have a chance to try the GearVR, you will see that the latency they have achieved is quite impressive.
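The core idea behind time warp is that head orientation is sampled once when rendering begins, and again just before the image is sent to the display; the finished image is then re-projected by the small rotation that occurred in between, which is far cheaper than re-rendering the scene. A minimal sketch of that idea, simplified to yaw only (all names and numbers here are hypothetical, not the actual Oculus algorithm):

```python
# Sketch of the idea behind "time warp", reduced to a single yaw axis.
# A real implementation re-projects the rendered image on the GPU using
# the full 3D rotation delta; this just shows where the delta comes from.

def timewarp_yaw_correction(yaw_at_render_deg, yaw_at_display_deg):
    """Residual head rotation to apply to the already-rendered image."""
    return yaw_at_display_deg - yaw_at_render_deg

# Head was at 10.0 degrees when rendering started and has drifted to
# 10.8 degrees by the time the frame is ready to display:
correction = timewarp_yaw_correction(10.0, 10.8)
print(correction)  # a small warp (~0.8 degrees) hides the render delay
```

Because the warp uses the freshest sensor reading available, the image the user finally sees matches their head pose much more closely than the frame as originally rendered, even when rendering itself took a full frame.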
The Oculus Rift DK1, the first Oculus Rift developer kit, shipped to consumers in 2013 and tracked the orientation of the user’s head but not its position. This meant that the user could turn their head to look around the environment, but if they moved their head forward, back, up, or down, that movement would not be reflected on the display, and this would cause simulator sickness. Oculus addressed this problem by adding positional tracking support in the Oculus Rift DK2. The DK2 ships with an external infrared camera that faces the user and tracks the position of the HMD using infrared LEDs built into the front of the headset. This works pretty well, but it is not difficult to accidentally move your head outside of the range of the camera and lose positional tracking. Tracking is also lost if you turn your head completely away, since there are no LEDs on the back strap of the HMD. The latest Oculus prototype, Crescent Bay, adds LEDs to the back of the head strap to enable 360-degree positional tracking.
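Why tracking drops out when you lean too far is easy to see geometrically: the camera can only track LEDs that fall inside its field of view. A toy check of that condition (the field-of-view angle and function are made up for illustration, not the DK2 camera’s actual specs):

```python
# Sketch: camera-based positional tracking only works while the HMD
# stays inside the tracking camera's field of view. Camera sits at the
# origin looking down +Z; the FOV angle here is hypothetical.

import math

def in_camera_fov(hmd_pos, fov_deg=74.0):
    """True if a point is inside a cone of fov_deg around the +Z axis."""
    x, y, z = hmd_pos
    if z <= 0:  # beside or behind the camera: definitely not visible
        return False
    off_axis_deg = math.degrees(math.atan2(math.hypot(x, y), z))
    return off_axis_deg <= fov_deg / 2.0

print(in_camera_fov((0.0, 0.0, 1.5)))  # True: seated centered in front
print(in_camera_fov((2.0, 0.0, 0.5)))  # False: leaned far off to the side
```

When the check fails, a real system falls back to orientation-only tracking until the LEDs re-enter the frustum, which is exactly the momentary “lost positional tracking” hiccup DK2 users experience.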
In addition to reducing simulator sickness, positional tracking gives developers the opportunity to create more novel applications, especially games. Developers can create games that involve the user moving their head to dodge objects, or to peer around corners. That said, positional tracking is not absolutely essential for all VR applications. The GearVR does not include built-in support for positional tracking, and there are still many great applications being created for that device. The absence of positional tracking increases the chances of simulator sickness, but in experiences where the user does not feel compelled to move their head this is less likely. Note that additional hardware, such as the sensors included with the Sixense STEM system, can be used to add positional tracking support to devices that lack it.
The positional tracking support in current consumer VR device prototypes may not be perfect, but clearly it is a problem that engineers are close to solving.
When the user’s perspective is rotated or accelerated in the virtual world while their body remains stationary in the real world, there is a good chance the user will get nauseous. For this reason Oculus encourages developers to build sit-down experiences that do not involve too much acceleration or turning. While some novel experiences can be created without acceleration or turning, this limitation excludes many traditional game genres such as first-person shooters. There is such a demand for these applications that developers will continue to create them regardless. Unfortunately there is no easy solution to this problem. The obvious approach is a treadmill system such as the Virtuix Omni or the Cyberith Virtualizer. These devices hold the user up and allow them to slide their feet across a slippery surface to give the impression that they are walking and turning in the virtual world. These systems are a step in the right direction (no pun intended), but currently they are expensive and take up a lot of space. Whether most VR enthusiasts will be willing to invest in them remains to be seen.