Videostitch demonstrates live 360 3D video at SIGGRAPH

Plenty of people are attacking immersive video for many different applications, but one thing remains consistent: stitching is hard and time consuming. Companies like Google and Jaunt are showing off solutions that do the stitching for you in the cloud, but for someone looking for immediate results that is too slow. Then there is NextVR, which is doing plenty of impressive work on 3D content with some advanced geometric techniques, but its live streamed 3D content has been in the 180-degree format, not full 360. Enter Videostitch.

Videostitch is no newcomer to the livestreamed 360-degree video world. In fact, it was their software that powered the stitching for the timeline 360 videos Facebook demonstrated at F8 this year, but adding 3D to their stack is something we haven't seen before.

I had a chance to go hands-on with their live stitched video at SIGGRAPH, and it was quite impressive. While the 3D didn't match the depth results achieved by NextVR, for example, it held up well, as did the speed of the result. In previous demonstrations I've experienced, there was about 2-3 seconds of perceived latency between my movements and the video output; that has been significantly reduced. While still not 100% real time, the video plays close enough to it that the latency becomes much more difficult to notice. It now sits at "about half a second delay" between the capture and the display, according to Videostitch CEO Nicolas Burtey.

Burtey goes on to say, however, that the majority of this latency is due to the GoPros they are currently using, which "send the video over HDMI with a 300 millisecond delay." The stitching itself is done in "less than 100 milliseconds," and then there is a single frame of delay from the Oculus.
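As a rough sanity check on those numbers, here is a minimal back-of-the-envelope sketch. It assumes one frame of display delay at a 75 Hz refresh rate (a DK2-class Oculus headset); the actual headset and refresh rate used in the demo were not specified.

```python
# Rough sum of the latency components quoted by Burtey.
gopro_hdmi_delay_ms = 300            # GoPro output over HDMI, per Burtey
stitching_delay_ms = 100             # "less than 100 milliseconds" for the stitch
display_frame_delay_ms = 1000 / 75   # one frame at an assumed 75 Hz refresh (~13 ms)

total_ms = gopro_hdmi_delay_ms + stitching_delay_ms + display_frame_delay_ms
print(f"Estimated end-to-end latency: ~{total_ms:.0f} ms")
# ~413 ms, which lines up with the "about half a second" figure quoted above.
```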

This is all with a local display only. When you take it up to broadcasting, it does take a bit longer to go from capture at the scene to display on your couch, "about 30 seconds," but that is to be expected as the signal has to travel long distances; the same thing happens with standard, non-VR broadcasts.

More important than speed is the quality of the stitch itself, which is pretty spectacular. During my hands-on I noticed only a couple of small stitching errors, but nothing experience-breaking. For example, when I turned and watched the crowd at SIGGRAPH walk past me, I did see one person disappear and reappear through a seam, but errors like this were few and far between.

Video is half the battle; audio is the other. Videostitch is currently working on a binaural audio solution to add to their technology stack, but they don't yet have a set release date for it.

Burtey believes this technology will initially be most useful for directors and content creators, who rely on rapid feedback to iterate quickly on set. When you are spending more than $100,000 a day on a full-scale production, counting the seconds and getting the most out of them is crucial, and with live, real-time stitching you can immediately get the feedback you need to direct the flow of a production.

We will continue to follow Videostitch's progress. To learn more, check out their website here.
