Apple unveiled a new line of iPad Pros that includes a LiDAR scanner and “new depth frameworks” to combine depth information from all of the device’s sensors and cameras “for a more detailed understanding of a scene.”
The new iPads start at $800 and include the LiDAR scanner and two wide-angle cameras, the wider of which offers a 125-degree field of view.
According to Apple, “Every existing ARKit app automatically gets instant AR placement, improved motion capture and people occlusion. Using the latest update to ARKit with a new Scene Geometry API, developers can harness the power of the new LiDAR Scanner to unleash scenarios never before possible.”
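Based on Apple’s published ARKit documentation, opting into the LiDAR-backed scene mesh comes down to one configuration flag. The sketch below is a minimal, hypothetical setup (view plumbing and mesh rendering omitted); the API names are Apple’s, but the function itself is ours for illustration:

```swift
import ARKit

// Minimal sketch: turn on LiDAR scene reconstruction for an AR session.
// Scene reconstruction is only available on LiDAR-equipped devices,
// so the capability check guards the assignment.
func startSceneReconstruction(in sceneView: ARSCNView) {
    let configuration = ARWorldTrackingConfiguration()

    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }

    sceneView.session.run(configuration)
}
```

Once running, ARKit delivers the reconstructed geometry as `ARMeshAnchor` objects through the session’s anchor callbacks, which is what the Scene Geometry API exposes to developers.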
Mixed reality startup LIV recently released a test version of its camera app for iOS. On devices with an A12 or newer processor, the app automatically recognizes the background of the scene. This can be used to composite a player wearing a VR headset into their virtual world without the need for a green screen. While the new iPad Pro features an A12Z Bionic chip, it is currently unknown whether it will work with an app like LIV.
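The A12 requirement lines up with ARKit’s people occlusion feature, which segments a person (with depth) out of the camera feed. A hedged sketch of how an app like LIV could request it, using Apple’s documented `frameSemantics` API (the function name is our own, for illustration):

```swift
import ARKit

// Sketch: request person segmentation with depth, the ARKit capability
// that lets apps composite a real person into virtual content without
// a green screen. Supported only on A12 Bionic and later devices.
func enablePeopleOcclusion(for session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()

    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        configuration.frameSemantics.insert(.personSegmentationWithDepth)
    }

    session.run(configuration)
}
```

The runtime check matters here: on pre-A12 hardware, requesting an unsupported frame semantic would cause the session to fail, which is presumably why LIV gates the feature by processor generation.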
Still, the new iPad Pro looks like it could be extremely useful for VR and AR. In 2018, Facebook showed an impressive tech demo at its OC5 developer conference featuring six people playing Dead and Buried on Oculus Quest at “arena” scale, with a tablet able to peer into the scene in real time. Check it out here:
Facebook’s Oculus app already supports casting the view from an Oculus Quest to an iOS device. If Facebook could take advantage of the new depth information provided by this latest iPad, one day it might be possible to simply point the device at your friend wearing an Oculus Quest and peer into their virtual world.
Of course, Facebook has made no announcements about support for this kind of capture directly on an Apple device. We’ll provide updates as we hear whether developers are able to take advantage of the 3D-sensing capabilities of the new iPad Pro.