The World's First 3D Camera for Depth Sensing and Motion Tracking

Using advanced sensing technology based on human stereo vision, the ZED camera adds depth perception, positional tracking and 3D mapping to any application.

Depth perception at large distances

ZED perceives the world in three dimensions. Using binocular vision and high-resolution sensors, the camera can tell how far away surrounding objects are, from 0.5 m to 20 m, at up to 100 FPS, indoors and outdoors.

6-Axis Positional Tracking

Thanks to visual odometry technology, track camera motion in 3D space and get position and orientation at up to 100Hz with millimeter accuracy. No markers or external sensors needed. Supports Unity and ROS.

Large-scale 3D Mapping

With the ZED, capture a 3D map of your environment in seconds. The mesh can be used for real-time obstacle avoidance, visual effects or world-scale AR.

How does the ZED work?

The ZED is a passive stereo camera that reproduces the way human vision works. Using its two “eyes”, the ZED creates a three-dimensional map of the scene by comparing the displacement of pixels between the left and right images.
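The pixel displacement (disparity) between the two views maps directly to distance: for a rectified stereo pair, depth equals focal length times baseline divided by disparity. A minimal sketch of that relationship; the focal length and baseline below are illustrative assumptions, not official ZED calibration values:

```python
def depth_from_disparity(disparity_px, focal_px=700.0, baseline_m=0.12):
    """Depth Z = f * B / d for a rectified stereo pair.

    disparity_px: horizontal pixel displacement between left and right images.
    focal_px:     focal length in pixels (assumed value for illustration).
    baseline_m:   distance between the two lenses in meters (assumed value;
                  the ZED's baseline is on the order of 12 cm).
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_px * baseline_m / disparity_px


# A large disparity means a near object; a small disparity, a far one:
near = depth_from_disparity(168.0)  # 700 * 0.12 / 168 = 0.5 m
far = depth_from_disparity(4.2)     # 700 * 0.12 / 4.2 = 20 m
```

Note how quickly depth resolution degrades as disparity shrinks: a sub-pixel matching error matters far more at 20 m than at 0.5 m, which is why long-range stereo needs high-resolution sensors.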

How is the ZED different from other depth sensors?

Until now, 3D sensors have been limited to perceiving depth at short range and indoors. The ZED Stereo Camera is the first sensor to combine long-range indoor and outdoor depth perception with 3D motion tracking, enabling new applications in many industries: AR/VR, drones, robotics, retail, visual effects and more.

What is the output of the ZED?

The ZED captures two synchronized left and right videos of a scene and outputs a full resolution side-by-side color video on USB 3.0. This color video is used by the ZED software on the host machine to create a depth map of the scene, track the camera position and build a 3D map of the area.
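Because the output arrives as a single side-by-side frame, the left and right views can be recovered by slicing each frame in half. A minimal sketch with NumPy; the 2208×1242-per-eye resolution used here is just an assumed example, and any side-by-side frame splits the same way:

```python
import numpy as np


def split_side_by_side(frame):
    """Return the (left, right) halves of a side-by-side stereo frame."""
    height, width = frame.shape[:2]
    half = width // 2
    return frame[:, :half], frame[:, half:]


# Synthetic stand-in for one captured frame (height, width, BGR channels);
# a real frame would come from the camera driver instead:
frame = np.zeros((1242, 2 * 2208, 3), dtype=np.uint8)
left, right = split_side_by_side(frame)
assert left.shape == (1242, 2208, 3)
assert right.shape == (1242, 2208, 3)
```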

Can I use the ZED for positional tracking?

Yes. Using stereo SLAM technology, the camera understands its position and orientation in space. 6DoF positional tracking is available through the ZED SDK and its plugins.
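As an illustration of what "6DoF" means — three translation axes plus three rotation axes — the sketch below chains incremental poses as 4×4 homogeneous transforms, the way visual odometry accumulates frame-to-frame motion. This is not ZED SDK code; all names and values are illustrative:

```python
import numpy as np


def pose_matrix(tx, ty, tz, yaw):
    """4x4 homogeneous transform for a translation plus a rotation about Z.

    A full 6DoF pose would also carry pitch and roll; one rotation axis is
    enough to show how poses compose.
    """
    c, s = np.cos(yaw), np.sin(yaw)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    T[:3, 3] = [tx, ty, tz]
    return T


# Visual odometry estimates a small motion per frame and composes them:
step = pose_matrix(0.1, 0.0, 0.0, 0.0)  # move 10 cm along x each frame
pose = np.eye(4)
for _ in range(10):
    pose = pose @ step
# pose[:3, 3] is now the accumulated position, about 1 m along x.
```

Composing small per-frame motions like this is also why drift accumulates over time, and why SLAM systems correct the pose when a previously seen area is recognized.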

Can I do 3D scanning with the camera?

You can use the ZED to map large-scale environments with coarse precision for applications such as autonomous robotics, AR/VR and more. We do not recommend the ZED for detailed 3D scanning of objects, bodies or architecture.

Is there an API available?

Yes! Developers can download the ZED SDK on our Developer page. We also have a GitHub page that we update on a regular basis.

What third-party libraries do you support?

The ZED SDK and our GitHub page include plugins and samples for Unity, OpenCV, ROS and more.

Can I use the ZED for people counting and tracking?

The ZED's long depth range makes it a good fit for people counting and tracking. However, the camera will not provide fine segmentation of a hand or a body, especially at long range.

Which OS and platforms are supported by the ZED SDK?

The ZED SDK is compatible with Windows and Linux. It can run on any PC that meets the minimum hardware requirements, and on embedded platforms from NVIDIA: Jetson TK1, TX1, TX2 and Xavier.

Which graphics cards are supported by the ZED SDK?

The ZED SDK requires an NVIDIA GPU with compute capability greater than 2. If you don't have an NVIDIA GPU, or you're using OSX or Android, you can still use the ZED to capture side-by-side 3D video, but you will not be able to compute depth maps with the ZED SDK.

How do I extend the ZED USB 3.0 working distance?

You can use the ZED with USB 3.0 extension cables or optical fiber cables to extend the range of your camera up to 100m.

Do you have sample depth and color video of the ZED?

Did you check the video on our Product page? It's all shot with the ZED! You can also check our Developer page and YouTube channel for more videos.

Can I use multiple ZEDs at a time?

You can connect multiple ZED cameras to a single PC. This feature is supported on both Windows and Linux.

Can I use the ZED without the SDK?

Yes, the ZED camera is UVC (USB Video Class) compliant, so you can capture the left and right video streams of the ZED on Windows, Linux and OSX without the ZED SDK.

Is the depth map generated in the camera?

We want to provide constant updates and improvements to the ZED's depth sensing capabilities. This is why the depth map is computed on the host machine connected to the ZED, rather than on the camera itself.

Will the ZED still work in USB 2.0 mode?

The ZED is backward compatible with USB 2.0 at lower resolution and framerate (VGA mode only).