Google announced today that the Depth API is now available for ARCore 1.18 on Android and Unity. The Depth API is meant to improve occlusion and increase realism thanks to new interaction types.
The Depth API was first announced with a preview on the Google developers blog last year. The API allows a device to estimate the depth of objects seen by the camera, determining how near or far each one is. In terms of AR, the API helps to significantly improve occlusion, which Google succinctly describes as “the ability for digital objects to accurately appear in front of or behind real world objects.”
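To illustrate the idea behind occlusion, here is a conceptual sketch (not the actual ARCore API): given a per-pixel depth map of the real scene and the rendered depth of a virtual object, a virtual pixel is drawn only where it is closer to the camera than the real surface. The function name and data layout below are illustrative assumptions.

```python
# Conceptual sketch of depth-based occlusion (not the ARCore API).
# real_depth and virtual_depth are 2D lists of distances in meters;
# None in virtual_depth means no virtual content at that pixel.

def occlude(real_depth, virtual_depth):
    """Return a visibility mask: True where the virtual object should be drawn."""
    mask = []
    for real_row, virt_row in zip(real_depth, virtual_depth):
        mask.append([
            v is not None and v < r  # virtual pixel is in front of the real surface
            for r, v in zip(real_row, virt_row)
        ])
    return mask

# A virtual object 1.5 m away, partially hidden behind a real couch 1.0 m away.
real = [[1.0, 3.0],
        [1.0, 3.0]]
virtual = [[1.5, 1.5],
           [None, 1.5]]
print(occlude(real, virtual))  # [[False, True], [False, True]]
```

In practice ARCore supplies the real-world depth map, and the comparison happens per-pixel on the GPU at render time.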
The example embedded above shows the dancing hotdog filter on Snapchat being accurately occluded by a lounge as the camera moves down. According to Google, another case where the API is useful is Five Nights at Freddy’s AR: Special Delivery, where occlusion is vital to the experience — characters can accurately hide behind objects and then deliver a jump scare by moving out from behind the real-world object. Niantic has shown something similar with Pokémon Go in the past as well.
However, occlusion is not the only use for the Depth API — Google notes that developers have found many other uses as well, including implementing more realistic physics, better surface interactions, and environmental traversal. For example, the Google Creative Lab experiment ‘Lines of Play’ allows users to build AR domino arrangements that accurately collide with furniture and walls in the room when the dominoes are knocked over.
The Depth API will begin rolling out today. You can read more over on the Google developers blog.