The USB Video Class (UVC) is the industry-standard interface for cameras; LeapUVC exposes the Leap Motion Controller's cameras through this interface, giving developers more control over properties such as LED brightness, gamma, exposure, gain, resolution, and more.

On the AR front, LeapUVC can enable apps to track objects affixed with ArUco markers as they move through space, and to map depth from the controller's stereo images.
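The stereo depth mapping mentioned above rests on the standard pinhole-stereo relation Z = f·B/d. The sketch below illustrates that relation only; the focal length and baseline values are illustrative assumptions, not Leap Motion calibration data, and this is not LeapUVC's actual API.

```python
import numpy as np

# Hypothetical camera parameters -- a real application would read the
# device's actual focal length and baseline from its calibration data.
FOCAL_LENGTH_PX = 400.0   # focal length in pixels (assumed)
BASELINE_MM = 40.0        # distance between the two cameras (assumed)

def disparity_to_depth(disparity_px: np.ndarray) -> np.ndarray:
    """Convert a stereo disparity map (pixels) to depth (millimetres).

    Uses the pinhole-stereo relation Z = f * B / d.
    Zero-disparity pixels are mapped to infinity.
    """
    with np.errstate(divide="ignore"):
        return np.where(disparity_px > 0,
                        FOCAL_LENGTH_PX * BASELINE_MM / disparity_px,
                        np.inf)

# A marker shifted 20 px between the left and right images would sit at:
depth = disparity_to_depth(np.array([20.0]))
print(depth[0])  # 400 * 40 / 20 = 800.0 mm
```

Note the inverse relationship: nearby objects produce large disparities, so depth resolution degrades quickly with distance, which is why short-baseline devices like this are best suited to near-field tracking.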

"We hope this experimental release will open up entirely new use cases for the Leap Motion Controller in education, robotics, art, academic research, and more," the Leap Motion team stated in a blog post.

On Thursday, the company also introduced an experimental build that adds multiple device support, enabling more than one Leap Motion Controller to run on a single 64-bit Windows computer with the CPU and USB bandwidth to handle the extra load.

The build comes with an installer and sample Unity code to get developers started. While it does not support hand-tracking from one device to another, or tracking from multiple angles, Leap Motion notes that the initial experimental build is a step in that direction.

"Multiple device support has been a longstanding feature request in our developer community, and we're excited to share this experimental release with everyone," the Leap Motion team said in a blog post. "Multiple interactive spaces can be used for multiuser AR/VR, art installations, location-based entertainment, and more."

With its Leap Motion Controller and its Project North Star open source headset design, Leap Motion continues to be one of the leading options for experimenting with gesture-based augmented reality experiences.

Even as more sophisticated AR headsets hit the market, the latest set of tools from Leap Motion may give developers even more opportunities to expand what's possible in the AR frontier.