Adobe has previewed a concept tool it is working on that would make it easier for VR video creators to place and align sound.
Producing high-quality 360-degree video content has traditionally been a difficult affair at all stages of production, from capture to delivery. However, a constant stream of new cameras, editing tools and streaming techniques is on the way to make the process easier.
Adobe’s SonicScape visualizes the location of audio within a spherical VR video as colorful bubbles. That visualization makes it easy to click and drag a sound to align it with the video, which could be ideal for cases where a microphone’s audio isn’t synced to the same location as the picture. The tool could help creators make the sound in their 360-degree projects more immersive, so that when you turn your head in different directions while wearing a head-mounted display, the audio seems to come from the right spot. It’s also possible to place additional sound effects at specific spots within the sphere, which could make it easier for creators to layer complex soundscapes into 360-degree projects, pulling from a library of effects.
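The geometry behind this effect is straightforward. The sketch below is not Adobe’s implementation, just a minimal illustration of the idea under common conventions: a sound is anchored at an azimuth/elevation on the sphere, the listener’s head yaw rotates its perceived direction, and a constant-power pan maps that direction to left/right gains. All function names and sign conventions here are assumptions for illustration.

```python
import math

def source_direction(azimuth_deg, elevation_deg):
    """Unit vector for a sound placed on the sphere (x = front, y = left, z = up)."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    return (math.cos(el) * math.cos(az),
            math.cos(el) * math.sin(az),
            math.sin(el))

def perceived_azimuth(azimuth_deg, head_yaw_deg):
    """Azimuth of a fixed sound relative to the listener after a head turn of
    head_yaw_deg (positive = turning left), wrapped into [-180, 180)."""
    return (azimuth_deg - head_yaw_deg + 180.0) % 360.0 - 180.0

def stereo_gains(azimuth_deg):
    """Constant-power pan: map an azimuth (+90 = hard left, -90 = hard right)
    to (left, right) channel gains."""
    az = max(-90.0, min(90.0, azimuth_deg))  # clamp rear sources to a side
    pan = math.radians((az + 90.0) / 2.0)    # 0 .. pi/2 across the frontal arc
    return math.sin(pan), math.cos(pan)

# A sound placed to the listener's left appears in front after they turn left.
rel = perceived_azimuth(90.0, 90.0)          # 0.0 degrees
left, right = stereo_gains(rel)              # equal gains: centered
```

A real renderer would use HRTFs or ambisonics rather than simple stereo panning, but the core operation is the same: rotate each source’s stored direction by the head pose before rendering.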
The concept was revealed at Adobe’s MAX conference in Las Vegas as part of the company’s showcase of forward-looking projects, and Project SonicScape may or may not be incorporated into shipping Adobe tools. Last year, Adobe premiered tools in a similar concept format aimed at its Premiere Pro video editing suite, enabling creators to easily play back and preview 360-degree content while wearing a head-mounted display and using hand controllers like Oculus Touch. Those tools just started shipping as part of the actual Adobe product line; if SonicScape similarly improves the workflow for 360-degree video creators, it could find its way into a future iteration of Adobe’s products.
Check out the video below for a look at the software: