Stereolabs - ZED series cameras
ZED cameras by Stereolabs are designed to enhance computer vision applications by providing accurate depth information. They feature dual lenses that capture high-resolution images and generate precise depth maps, enabling reliable object detection and recognition. Their simultaneous color and depth output makes them versatile for AI-driven tasks such as autonomous navigation, robotics, and augmented reality. ZED cameras offer CUDA acceleration, leveraging the power of NVIDIA GPUs to speed up depth computation and object detection, resulting in faster and more efficient processing of 3D vision tasks. Easy to integrate and widely used by professionals, ZED cameras are a go-to choice for those seeking advanced 3D vision and object detection capabilities.
This article is based on the official GitHub repository and documentation. It applies to all models of the ZED camera series: ZED, ZED 2, ZED 2i, etc.
Getting started
To run the device, you must meet the requirements listed below:
Software:
Hardware:
- Graphics Card: an NVIDIA GPU with Compute Capability > 6.1 or a Jetson embedded GPU is required *
- Connectivity: USB 3.0
* You can use the ZED camera without an NVIDIA GPU, but its capabilities are limited.
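If you are not sure whether your desktop GPU meets this requirement, recent NVIDIA drivers can report the compute capability directly (the compute_cap query field may not be available on older drivers; in that case, look up your GPU model in NVIDIA's CUDA GPUs table instead):
nvidia-smi --query-gpu=name,compute_cap --format=csv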
Demo
In this demo, we'll walk you through using the ZED camera with ROS 2 via a Docker image. You'll also learn how to visualize data, including image previews and point clouds, using RViz. The demo is based on the zed-ros2-wrapper repository.
Tested with a ROSbot XL, an NVIDIA Jetson Nano, and a ZED 2 camera.
Start guide
Simply plug the camera into a USB 3.0 port. Then use the
lsusb
command to check whether the device is visible.
Select the appropriate image depending on your platform (a quick way to check your platform is shown after the list):
- husarion/zed-desktop:humble - for desktop platforms with CUDA (tested on a platform with CUDA 11.7).
- husarion/zed-desktop-cpu:humble - a simple CPU-only demo version that lets you check whether the camera and IMU are working properly. The image supports both amd64 and arm64 architectures.
- husarion/zed-jetson:foxy - for Jetson platforms; currently supported: Jetson Xavier, Orin AGX/NX/Nano, CUDA 11.4 (tested on a Xavier AGX).
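If you are unsure which image to pick, you can quickly check your CPU architecture and the CUDA version supported by your GPU driver (these are generic checks, not specific to the ZED setup):
uname -m      # x86_64 corresponds to amd64 images, aarch64 to arm64
nvidia-smi    # the header reports the maximum CUDA version supported by the installed driver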
- Before you start, it is necessary to set a few variables appropriate to your configuration.
export ZED_IMAGE=<zed_image>
export CAMERA_MODEL=<camera_model>
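For example, on an x86_64 desktop with an NVIDIA GPU and a ZED 2 camera, the variables could look as follows (the model name zed2 follows the zed-ros2-wrapper naming convention; adjust both values to your hardware):
export ZED_IMAGE=husarion/zed-desktop:humble
export CAMERA_MODEL=zed2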
- Clone the repository.
git clone https://github.com/husarion/zed-docker.git
- Pull the Docker images.
cd zed-docker/demo
docker compose pull
- Run compose.
xhost local:root
docker compose up
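If you prefer to keep the terminal free, you can optionally run the demo in the background and follow its logs separately:
docker compose up -d
docker compose logs -f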
- Configure RViz.
Add the following topics to RViz:
- RGB image topic:
/<camera_model>/zed_node/rgb/image_rect_color
- Point cloud topic:
/<camera_model>/zed_node/point_cloud/cloud_registered
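You can also verify from a terminal that these topics are being published. This assumes a ROS 2 environment that can see the demo's topics (e.g. a shell inside one of the demo containers, or a host with ROS 2 installed when the containers use the host network); replace <camera_model> with the value you exported earlier:
# List the topics published by the ZED node
ros2 topic list | grep /<camera_model>/zed_node
# Check the publishing rate of the rectified RGB image
ros2 topic hz /<camera_model>/zed_node/rgb/image_rect_color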
Result
Now you should see a point cloud that represents the position of objects in 3D space, and when you move the camera you should also be able to observe the trajectory of its movement.
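The camera trajectory comes from the visual odometry computed by the ZED node. As a quick sanity check, you can print it in a terminal; the odometry topic name below follows the usual zed-ros2-wrapper layout and may differ between wrapper versions:
ros2 topic echo /<camera_model>/zed_node/odom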
ROS API
The full ROS API of the camera can be found in its official documentation.
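To explore the API of a running camera, you can also introspect the ZED node directly; the node name below is inferred from the topic names used in this demo:
# Show all publishers, subscribers, services and actions of the ZED node
ros2 node info /<camera_model>/zed_node
# List the parameters exposed by the node
ros2 param list /<camera_model>/zed_node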
Summary
Depth cameras are increasingly the basis of many modern robotics projects. ZED cameras were among the first depth cameras to use CUDA cores, which allows for high accuracy while maintaining a high frame rate. In addition, ZED officially supports ROS 2 and keeps delivering increasingly effective solutions.
Do you need any support with completing this project or have any difficulties with software or hardware? Feel free to share your thoughts on our community forum: https://community.husarion.com/ or to contact our support: support@husarion.com