Introduction

The ZED is a camera that reproduces the way human vision works. Using its two “eyes” and triangulation, the ZED builds a three-dimensional understanding of the scene it observes, allowing your application to become space- and motion-aware.

This guide will show you how to get started.

Depth Perception

Depth perception is the ability to determine distances between objects and see the world in three dimensions. Up until now, depth sensors have been limited to perceiving depth at short range and indoors, restricting their application to gesture control and body tracking. Using stereo vision, the ZED is the first universal depth sensor:

Depth can be captured at longer ranges, up to 20 m (extended to 30 m in ULTRA mode).

The frame rate of depth capture can be as high as 100 FPS.

The field of view is much larger, up to 90° (H) × 60° (V).

The camera works both indoors and outdoors, unlike active sensors such as structured-light or time-of-flight cameras.
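The stereo principle behind these figures can be sketched in a few lines: depth is triangulated from the disparity between the left and right images as Z = f·B/d, where f is the focal length in pixels, B is the baseline between the two lenses, and d is the disparity. A minimal sketch, assuming a ZED-like 120 mm baseline; the function name and the example values are illustrative, not ZED SDK calls:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Triangulate depth (in meters) from stereo disparity.

    disparity_px: horizontal pixel offset of a point between the two images
    focal_px:     focal length in pixels
    baseline_m:   distance between the two camera centers in meters
    """
    if disparity_px <= 0:
        return float("inf")  # no match, or point at infinity
    return focal_px * baseline_m / disparity_px

# Illustrative values: 120 mm baseline, ~700 px focal length
print(depth_from_disparity(disparity_px=8.4, focal_px=700, baseline_m=0.12))
# ≈ 10.0 m
```

Note the inverse relationship: disparity shrinks as distance grows, which is why a wide baseline and sub-pixel disparity estimation are what make long-range stereo depth possible.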

Positional Tracking

Using computer vision and stereo SLAM technology, the ZED also understands its position and orientation in space, offering full 6DOF positional tracking.

In VR/AR, this means you can now walk around freely and the camera will track your movements anywhere. If you’re into robotics, you can now reliably determine your robot’s position, orientation, and velocity and make it navigate autonomously to the coordinates of your choice on a map. You can access 6DOF motion tracking data through the ZED SDK or its plugins: Unity, ROS…
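To make the navigation use case concrete, here is a minimal sketch of how a robot might consume a tracked pose: given the current position and orientation, compute the distance and heading correction toward a goal point on the map. The pose is reduced to a planar (x, y, yaw) slice of the full 6DOF pose for clarity, and the function name is illustrative, not a ZED SDK call:

```python
import math

def goal_vector(pose, goal):
    """Given a tracked planar pose (x, y, yaw in radians) and a goal point,
    return (distance, heading_error): how far to travel and how much to turn."""
    x, y, yaw = pose
    gx, gy = goal
    dx, dy = gx - x, gy - y
    distance = math.hypot(dx, dy)
    bearing = math.atan2(dy, dx)
    # Wrap the heading error into (-pi, pi] so the robot turns the short way
    heading_error = (bearing - yaw + math.pi) % (2 * math.pi) - math.pi
    return distance, heading_error

# Example: robot at the origin facing +x, goal at (3, 4)
d, h = goal_vector((0.0, 0.0, 0.0), (3.0, 4.0))
print(round(d, 2), round(h, 3))  # distance 5.0, heading ≈ 0.927 rad
```

In a real loop this would run at every tracking update, feeding the distance and heading error into a motion controller until the goal is reached.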

Spatial Mapping

Spatial mapping is the ability to capture a digital model of a scene or an object in the physical world. By merging the real world with the virtual world, it is possible to create convincing mixed reality experiences or robots that understand their environment.

The ZED continuously scans its environment to reconstruct a 3D map of the real world. It refines its understanding of the world by combining new depth and position data over time. Spatial mapping is available either through the ZEDfu application or the ZED SDK.
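The core step in combining depth and position data is back-projection: each depth pixel is lifted into world coordinates using the camera intrinsics and the tracked pose, and the resulting points are fused into the map. A minimal sketch of that step, with a yaw-only rotation standing in for the full 3×3 rotation of a 6DOF pose; all names and values are illustrative, not ZED SDK calls:

```python
import math

def depth_pixel_to_world(u, v, depth_m, fx, fy, cx, cy, cam_pos, cam_yaw):
    """Back-project one depth pixel into world coordinates.

    (u, v)        : pixel coordinates; depth_m: measured depth at that pixel
    fx, fy, cx, cy: pinhole intrinsics (focal lengths, principal point)
    cam_pos       : camera position (x, y, z) in the world frame
    cam_yaw       : camera yaw in radians (simplified rotation for clarity)
    """
    # Pixel -> camera frame (z forward, x right, y down)
    xc = (u - cx) * depth_m / fx
    yc = (v - cy) * depth_m / fy
    zc = depth_m
    # Camera frame -> world frame (world x forward at yaw 0, z up)
    cos_y, sin_y = math.cos(cam_yaw), math.sin(cam_yaw)
    xw = cam_pos[0] + cos_y * zc - sin_y * xc
    yw = cam_pos[1] + sin_y * zc + cos_y * xc
    zw = cam_pos[2] - yc  # camera y points down, world z points up
    return (xw, yw, zw)

# A centered pixel 2 m away, camera at the origin facing +x
print(depth_pixel_to_world(640, 360, 2.0, 700.0, 700.0, 640.0, 360.0,
                           (0.0, 0.0, 0.0), 0.0))
```

Repeating this for every pixel of every frame, with the pose updated by the tracker, is what lets the accumulated points (or a fused mesh, as in ZEDfu) stay consistent as the camera moves.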