
Clay Brings Gesture Tracking To Apple's ARKit With No Extra Hardware

One of the remarkable things about Apple’s new ARKit developer platform is that it requires no extra hardware. All of the impressive AR experiences we’ve seen running on iPhones use nothing but the phone’s camera. Clay is looking to do the same with gesture tracking.

Developers may already be familiar with Clay: it’s an SDK that lets smartphone apps track the user’s hand in 3D using just the phone’s camera. It can recognize more than 30 hand gestures, allowing controller-free navigation of experiences. In the past the company has showcased it as a way to interact with computers and to control smartphone-based VR experiences. You can see those examples in action below.

https://youtu.be/Nqdsk4_COdU

After Apple introduced ARKit to the world last month, the Clay team set about integrating it into their SDK, and the company tells us the integration will be ready within the next week or two.

On paper, that means ARKit could gain simple, accessible control options when it launches in full with iOS 11. You can currently see a similar solution on Microsoft’s HoloLens, where users pinch their fingers to interact with virtual objects and interfaces in the real world. The difference is that the iPhone isn’t an AR headset (at least not yet), so you’d probably have to hold the phone in one hand and reach out to gesture with the other.

The SDK already works with the Unity engine, and support for Samsung’s Galaxy phones is planned for the near future.
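Clay hasn’t published its ARKit integration yet, so there’s no official API to show. Still, for the curious, here’s a rough Swift sketch of what the ARKit side of such an integration might look like. The ARSession/ARSessionDelegate plumbing is genuine ARKit; ClayHandTracker, ClayGesture, and their delegate methods are hypothetical stand-ins for whatever Clay eventually ships.

```swift
import UIKit
import ARKit

// Hypothetical stand-ins for Clay's SDK — the real API wasn't public
// at the time of writing, so these names are illustrative only.
enum ClayGesture { case pinch, swipe, openPalm }

protocol ClayHandTrackerDelegate: AnyObject {
    func handTracker(didRecognize gesture: ClayGesture)
}

final class ClayHandTracker {
    weak var delegate: ClayHandTrackerDelegate?
    // Imagined entry point: feed each camera frame to the tracker.
    func process(pixelBuffer: CVPixelBuffer) { /* hand detection would go here */ }
}

// Real ARKit plumbing: receive camera frames via ARSessionDelegate
// and forward them to the (hypothetical) gesture tracker.
final class GestureViewController: UIViewController, ARSessionDelegate, ClayHandTrackerDelegate {
    private let session = ARSession()
    private let tracker = ClayHandTracker()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self
        tracker.delegate = self
        session.run(ARWorldTrackingConfiguration())
    }

    // ARKit calls this for every captured camera frame.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        tracker.process(pixelBuffer: frame.capturedImage)
    }

    func handTracker(didRecognize gesture: ClayGesture) {
        if case .pinch = gesture {
            // e.g. grab the virtual object under the user's fingers
        }
    }
}
```

The key point is that ARKit already hands the app every camera frame, so a gesture-recognition layer only needs to piggyback on that stream rather than require any extra sensors.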

It sounds promising, though we’ve yet to go hands-on with the platform and try it for ourselves.
