Explorations in VR Design is a journey through the bleeding edge of VR design – from architecting a space and designing groundbreaking interactions to making users feel powerful.
Sound is essential for truly immersive VR. It conveys depth and emotion, builds and reinforces interactions, and guides users through alien landscapes. Combined with hand tracking and visual feedback, sound even has the power to create the illusion of tactile sensation.
In this Exploration, we’ll cover the fundamentals of VR sound design, then take a deep dive into the auditory world of Blocks. Along the way, we’ll break a few laws of physics and uncover the surprising complexity of physical sound effects.
What Can Great Sound Design Achieve in VR?
Presence and Realism in 3D Space
When it comes to depth cues, stereoscopic vision is a massive improvement on traditional monitors. But it’s not perfect. For this reason, sound is more than just an immersive tool – how (and where) objects around you sound has an enormous effect on your understanding of where they are, especially when you’re not looking at them. This applies to everything from background noises to user interfaces.
Engines like Unity and Unreal are constantly getting better at representing sound effects in 3D space – with binaural audio, better reverb modeling, better occlusion and obstruction modeling, and more. The more realistic that zombie right behind you sounds, the more your hair stands on end.
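As a back-of-the-envelope illustration (not how Unity or Unreal actually implement spatialization), a sound’s 3D position can be boiled down to two cues: how far away it is, and which ear it favors. Everything in this sketch – the attenuation model, the equal-power pan, the 2D positions – is a simplification for clarity.

```python
import math

def spatialize(listener_pos, listener_forward, source_pos, ref_distance=1.0):
    """Toy spatializer: distance attenuation plus a simple left/right level difference.

    Positions and the forward vector are (x, z) pairs in the ground plane.
    Real engines layer HRTFs, reverb, and occlusion on top of this idea."""
    dx = source_pos[0] - listener_pos[0]
    dz = source_pos[1] - listener_pos[1]
    distance = max(math.hypot(dx, dz), ref_distance)

    # Intensity follows the inverse square law; amplitude falls off roughly as 1/distance.
    gain = ref_distance / distance

    # Angle between where the listener is facing and where the sound is.
    facing = math.atan2(listener_forward[0], listener_forward[1])
    bearing = math.atan2(dx, dz) - facing

    # Crude interaural level difference via an equal-power pan.
    pan = math.sin(bearing)                 # -1 = hard left, +1 = hard right
    left_gain = gain * math.sqrt((1 - pan) / 2)
    right_gain = gain * math.sqrt((1 + pan) / 2)
    return left_gain, right_gain

# A sound two meters away, just behind the listener's right shoulder:
print(spatialize((0, 0), (0, 1), (1.5, -1.3)))
```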
Mood and Atmosphere
Music plays a crucial role in setting the mood for an experience. Blocks has a techno vibe with deep bass, inspired by ambient artists like Ryuichi Sakamoto.
Weightless features soft piano tracks that feel elegant and contemplative.
Finally, Land’s End combines a dreamy and surreal quality with hard edges like tape saturation and vinyl noise.
If you imagine shuffling the soundtracks between these three examples, you can see how it would fundamentally change each experience.
Building and Reinforcing Interactions
Sound communicates the inception, success, failure, and overall nature of interactions and game physics, especially when the user’s eyes are drawn elsewhere. Blocks, for example, is designed with a wide range of sounds – from the high and low electronic notes that signal the block creation interactions, to the echoes of blocks bashing against the floor.
For game developers, this is also a double-edged sword that relies on careful timing, as even being off by a quarter second can disrupt the experience.
Tutorial Audio
It’s sad but true – most users don’t read instructions. Fortunately, while written instructions have to compete with a huge variety of visual stimuli, you have a lot more control over what your user hears.
Using the abstract state capabilities in Unity’s Mecanim system, you can easily build a flow system so that your audio cues are responsive to what’s actually happening. Just make sure that the cues work within the narrative and don’t become repetitive.
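We can’t reproduce an actual Mecanim graph here, but the idea carries over to any state machine: attach a cue to each tutorial state, play it only when that state is entered, and don’t repeat it once the user has heard it. The sketch below is engine-agnostic Python, and the state names and clip filenames are purely illustrative.

```python
import random

# Illustrative tutorial states and their voiceover cues (filenames are made up).
TUTORIAL_CUES = {
    "waiting_for_hands": ["hold_out_your_hands.ogg"],
    "ready_to_create":   ["pinch_with_both_hands.ogg", "try_pulling_your_hands_apart.ogg"],
    "block_created":     ["nice_now_try_stacking.ogg"],
}

class TutorialAudio:
    """Plays a cue when the tutorial state changes, and avoids nagging on repeats."""

    def __init__(self, play_clip):
        self.play_clip = play_clip          # callback into the engine's audio system
        self.current_state = None
        self.heard = set()                  # states the user has already been coached on

    def on_state_changed(self, new_state):
        if new_state == self.current_state:
            return                          # nothing changed; stay quiet
        self.current_state = new_state
        cues = TUTORIAL_CUES.get(new_state)
        if cues and new_state not in self.heard:
            self.play_clip(random.choice(cues))
            self.heard.add(new_state)

# Example usage with a stand-in audio backend:
tutorial = TutorialAudio(play_clip=lambda clip: print("playing", clip))
tutorial.on_state_changed("waiting_for_hands")
tutorial.on_state_changed("ready_to_create")
tutorial.on_state_changed("ready_to_create")    # repeated state: no cue
```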
Setting Boundaries
Virtual reality is an exciting medium, but it can take first-time users a few minutes to learn its limitations. Our hand tracking technology can only track what it can see, so you may want to design interaction sounds that fade out as users approach the edge of the tracking field of view.
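One way to implement that fade is to scale interaction-sound volume by how close the hand is to the edge of the tracking cone. This is only a sketch – the field-of-view and fade-band angles below are assumptions, not actual tracking specs.

```python
def edge_fade(hand_angle_deg, half_fov_deg=70.0, fade_band_deg=15.0):
    """Volume multiplier for interaction sounds based on how close the hand is
    to the edge of the tracking field of view.

    hand_angle_deg: angle between the sensor's forward axis and the hand.
    Returns 1.0 well inside the FOV, fading to 0.0 at the edge."""
    fade_start = half_fov_deg - fade_band_deg
    if hand_angle_deg <= fade_start:
        return 1.0
    if hand_angle_deg >= half_fov_deg:
        return 0.0
    return 1.0 - (hand_angle_deg - fade_start) / fade_band_deg

# A hand 62 degrees off-axis is well into the fade band:
print(edge_fade(62.0))   # ~0.53
```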
Evoking Touch
In the absence of touch feedback, visual and auditory feedback can fill the cognitive gap and reinforce which elements of a scene are interactive, and what happens when the user “touches” them. This is because our brains “continuously bind information obtained through many sensory channels to form solid percepts of objects and events.” Some users even describe phantom sensations in VR, which are almost always associated with compelling sound design. To achieve this level of immersion, sounds must be perfectly timed and feel like they fit with the user’s actions.
Sound Design in Blocks
We’ve already talked about its ambient-inspired soundtrack, but you might be surprised to learn the sound effects in Blocks were one of our biggest development challenges – second only to the physical object interactions, an early prototype of the Leap Motion Interaction Engine.
Magic and Progression
One of the core design ideas behind Blocks was that we never imply a specific device underneath anything. For example, there are no whirring or mechanical noises when the arm HUD appears. It’s just something that magically appears from nowhere. The block creation sounds are also minimal, suggesting a natural progression. This was central to the narrative we wanted to tell – the miraculous power to create things with your bare hands.
This philosophy was also reflected in the physical sound effects, which were designed to suggest the embodiment of the object itself, rather than a specific material. When you grab something, a subtle clicking sound plays. Nothing fancy – just tactile, quick, and precisely timed.
Getting the Right Physical Sound Effects
Here’s where it got challenging. To ensure a natural and immersive experience, the physical sound of block impacts is driven by 33 distinct effects, modified by factors like block size, collision velocity, and random variations that give each block its own unique character. This aspect of the design proved nontrivial, but it was also a fundamental part of the final product.
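The exact mapping Blocks uses isn’t spelled out here, but the general shape of the technique looks something like this: pick a clip from the set based on impact intensity, then modulate volume and pitch by speed, size, and a little randomness. The thresholds and ranges in this sketch are illustrative assumptions, not the shipped values.

```python
import random

NUM_IMPACT_CLIPS = 33   # Blocks ships 33 distinct impact effects

def block_impact(block_size, impact_speed, min_audible_speed=0.2):
    """Pick an impact clip and per-hit volume/pitch from a collision.

    block_size:   rough edge length in meters
    impact_speed: relative collision speed in m/s
    Returns (clip_index, volume, pitch), or None for impacts too soft to hear."""
    if impact_speed < min_audible_speed:
        return None

    # Harder hits index into the "heavier" end of the clip set.
    intensity = min(impact_speed / 5.0, 1.0)
    clip_index = min(int(intensity * NUM_IMPACT_CLIPS), NUM_IMPACT_CLIPS - 1)

    # Louder with speed, capped so a pile of blocks doesn't overwhelm the mix.
    volume = min(0.3 + 0.7 * intensity, 1.0)

    # Bigger blocks sound deeper; a little randomness keeps repeats from feeling canned.
    pitch = (1.2 - 0.4 * min(block_size, 1.0)) * random.uniform(0.95, 1.05)

    return clip_index, volume, pitch

print(block_impact(block_size=0.4, impact_speed=3.2))
```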
Since the blocks don’t have a representative material (such as metal or glass), finding the right sound took time. In creating the Blocks audioscape, sound designer Jack Menhorn experimented with kitty litter, plastic jugs, cardboard boxes, and other household objects. The final sound suite was created by putting synths into cardboard boxes and slamming the boxes into each other.
Violating the Laws of Physics
In an abstract environment with simple geometry, sound design is the difference between disbelief and physical presence. Sometimes this involves breaking the laws of physics. When you have to decide between being accurate and delighting your user, always opt for the latter.
In the real world, sound follows the inverse square law – getting quieter as the source gets farther away. The Unity game engine models this real-world falloff by default. But a block that lands silently after being thrown a long distance isn’t very satisfying. With Blocks, we kept a normal falloff for the first several meters, but beyond that point the falloff simply stops – blocks sound the same volume, no matter how far away they are.
At the same time, the reverb goes up as blocks get farther away – creating an echo effect. This would be impossible in the real world, since there are no walls or other surfaces in the space to produce reverb. It’s all part of setting the rules of a virtual world in ways that feel human, even as they violate the laws of physics. So far, no one has complained.
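Here’s roughly what that rule looks like as code – an engine-agnostic sketch in which the clamp distance and reverb amounts are assumptions, not the numbers used in Blocks.

```python
def distant_block_mix(distance, falloff_end=12.0, max_reverb=0.8):
    """Volume and reverb send for a block impact `distance` meters away.

    Volume falls off roughly as 1/distance up to falloff_end, then the
    falloff simply stops: anything farther keeps the same volume. Reverb
    does the opposite and grows with distance, so far-away impacts read
    as 'far' through echo rather than through silence."""
    clamped = min(max(distance, 1.0), falloff_end)
    volume = 1.0 / clamped                  # ~inverse-distance falloff, then flat

    reverb_send = max_reverb * min(distance / falloff_end, 1.0)
    return volume, reverb_send

for d in (1, 5, 12, 40):
    print(d, distant_block_mix(d))   # past falloff_end, both values stop changing
```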
The Future of VR Sound Design
Imagine all the different ways that you can interact with a coffee mug, and how each action is reflected in the sound it makes. Pick it up. Slide it across the table. Tip it over. Set it down gently, or slam it onto the table. All of these actions create different sounds. Of course, if it breaks, that’s a whole other problem space with different pieces!
This is the root of the biggest challenge on the horizon for sound design in VR – the economy of scale. When you move from a simple scene with a few objects to fully realized scenes with many different objects, everything in the scene has to be interactive – and that includes sound. Each object needs its own sound variations, sensitive to how it’s handled.
This is one of the reasons why we recommend only having a few objects in a scene, and making the interactions for those objects as powerful as possible. As VR experiences grow in size and complexity, these new realities will need richer soundscapes than ever before.