At GDC earlier this year Valve’s Alex Vlachos revealed a set of techniques the company uses that could allow VR software to run on older, less expensive graphics cards. For developers, the technology represents a way to deliver software to a wider audience.
That technology is now in the hands of developers, released this week by Valve as The Lab Renderer. It includes the adaptive rendering technology that lets VR scenes run on a wider selection of graphics cards, along with other tools to aid Unity developers.
If developers apply these tools as updates to existing apps, Vive owners could see their VR experiences perform better on the same hardware.
“And if it performs better, the adaptive quality system will kick in and give customers a higher resolution rendering, so it will look better as well,” Vlachos wrote in an email.
The Lab is software bundled with the HTC Vive. Made by Valve, it represents one of the best introductions to highly interactive VR. It includes several mini game scenes, ranging from a bow and arrow to a secret shop inhabited by magical creatures. Overall, it teaches VR first-timers concepts like teleportation and how to use your hands. Most of The Lab was built in Unity, the creation engine used by a majority of VR developers, and The Lab Renderer is now available for free to developers using that toolset.
From a blog post about the release:
This isn’t a complete rendering solution that is meant to replace Unity’s extensive rendering features, but it does provide all of the features Valve used to ship The Lab. We are providing all source code and shader code so developers can modify it to best fit their needs.
Here are more of the included features for the technically-minded reader:
Single-Pass Forward Rendering and MSAA
Forward rendering is critical for VR, because it enables applications to use MSAA, the best-known anti-aliasing solution for VR. Deferred renderers suffer from aliasing issues due to the current state of the art in image-space anti-aliasing algorithms.
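In a stock Unity project, MSAA under the forward rendering path is controlled through the quality settings. A minimal sketch (the sample count of 4 here is an arbitrary choice, not a value from Valve's release):

```csharp
using UnityEngine;

public class EnableMsaa : MonoBehaviour
{
    void Awake()
    {
        // MSAA sample count: 0 disables it; 2, 4, and 8 are valid values.
        // This only takes effect when using a forward rendering path --
        // deferred rendering in Unity ignores this setting.
        QualitySettings.antiAliasing = 4;
    }
}
```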
Unity’s default forward renderer is multi-pass, meaning geometry is rendered an additional time for each runtime spotlight or point light that lights each object. The Lab’s renderer supports up to 18 dynamic, shadowing lights in a single forward pass.
The plugin requires that all materials use The Lab’s shaders. If you have a mixture of Unity shaders and The Lab shaders, Unity will render shadows multiple times, which will cost perf. There are menu options under Valve->ShaderDev to help you convert your existing materials to use The Lab’s shaders. We recommend backing up your project before running any of those helper commands.
We are shipping full shader source code, so you can customize the shaders to meet your product’s needs.
One of the least-known performance tricks in VR rendering is to flush the rendering API at a granularity that ensures the GPU is fed often enough to keep busy and avoid bubbles. This plugin calls Unity’s GL.Flush() in the main camera’s OnPostRender() and after shadow rendering for a total of up to 3 times per frame. This is critical for ensuring the GPU is receiving the draw calls that are being queued in the DirectX runtime in a timely manner.
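The camera-side portion of that flush could be approximated in an ordinary Unity script along these lines. This is a hedged sketch of the idea, not The Lab Renderer's actual implementation, which also flushes after shadow rendering:

```csharp
using UnityEngine;

// Attach to the main camera. Flushes queued rendering commands to the
// graphics driver after the camera finishes rendering, so the GPU starts
// working on them immediately instead of idling ("bubbles") while draw
// calls sit buffered in the DirectX runtime.
public class FlushOnPostRender : MonoBehaviour
{
    void OnPostRender()
    {
        // Submit all buffered draw calls now rather than waiting for the
        // implicit end-of-frame flush.
        GL.Flush();
    }
}
```

In practice the win depends on how the driver batches work; flushing too often adds API overhead, which is presumably why The Lab caps it at roughly three flushes per frame.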
Update: This post was updated with additional information from Alex Vlachos.