Camera-LiDAR calibration is performed in two steps:
- Obtain camera intrinsics
- Obtain camera-LiDAR extrinsics
The intrinsics are obtained using the autoware_camera_calibration script, which is a fork of the official ROS calibration tool.
- In a sourced terminal:
`rosrun autoware_camera_lidar_calibrator cameracalibrator.py --square SQUARE_SIZE --size MxN image:=/image_topic`
- Play a rosbag or stream from a camera on the selected topic name.
- Move the checkerboard around within the field of view of the camera until the bars turn green.
- Press the `CALIBRATE` button.
- The output and result of the calibration will be shown in the terminal.
- Press the `SAVE` button.
- A file named `YYYYmmdd_HHMM_autoware_camera_calibration.yaml` will be saved in your home directory.

This file contains the intrinsic calibration used to rectify the image.
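The saved intrinsics describe a standard pinhole camera model. As a minimal sketch of what the camera matrix does (the matrix below is a made-up placeholder, not output from the tool), here is how a 3D point in the camera frame maps to pixel coordinates:

```python
import numpy as np

# Hypothetical intrinsics of the kind stored in the calibration file:
# fx, fy are focal lengths in pixels; cx, cy is the principal point.
K = np.array([[800.0,   0.0, 640.0],
              [  0.0, 800.0, 360.0],
              [  0.0,   0.0,   1.0]])

def project(point_cam):
    """Project a 3D point (camera frame, meters) to pixel coordinates."""
    p = K @ np.asarray(point_cam, dtype=float)
    return p[:2] / p[2]  # perspective divide

u, v = project([0.5, -0.25, 2.0])
print(u, v)  # -> 840.0 260.0
```

Rectification additionally applies the distortion coefficients stored in the same file, which this sketch omits.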
| Flag | Parameter | Type | Description |
|---|---|---|---|
| `--square` | `SQUARE_SIZE` | double | Defines the size of the checkerboard square in meters. |
| `--size` | `MxN` | string | Defines the layout size of the checkerboard (inner size). |
| `image:=` | `image` | string | Topic name of the camera image source topic in raw format (color or b&w). |
| `--min_samples` | `min_samples` | integer | Defines the minimum number of samples required to allow calibration. |
| `--detection` | `engine` | string | Chessboard detection engine, default `cv2` or `matlab`. |

For extra details please visit: http://www.ros.org/wiki/camera_calibration
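Note that `MxN` counts the board's inner corners, not its printed squares. A quick sanity check (plain Python, board dimensions are illustrative):

```python
# A board with R x C printed squares exposes (C-1) x (R-1) inner corners,
# which is what the --size flag expects (e.g. an 8 x 10 board -> 9x7).
rows, cols = 8, 10                      # printed squares (illustrative)
inner = (cols - 1, rows - 1)            # inner corner counts
size_arg = "--size {}x{}".format(*inner)
print(size_arg)  # -> --size 9x7
```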
This node additionally supports the Matlab engine for chessboard detection, which is faster and more robust than the OpenCV implementation.
- Go to the Matlab Python setup path `/PATH/TO/MATLAB/R201XY/extern/engines/python`.
- Run `python setup.py install` to set up the Matlab bindings.

To use this engine, add `--detection matlab` to the list of arguments, i.e.:
`rosrun autoware_camera_lidar_calibrator cameracalibrator.py --detection matlab --square SQUARE_SIZE --size MxN image:=/image_topic`
Camera-LiDAR extrinsic calibration is performed by clicking on corresponding points in the image and the point cloud.
This node uses the `clicked_point` and `screenpoint` topics from the rviz and image_view2 packages, respectively.
- Perform the intrinsic camera calibration using the camera intrinsic calibration tool described above (resulting in the file `YYYYmmdd_HHMM_autoware_camera_calibration.yaml`).
- In a sourced terminal:
`roslaunch autoware_camera_lidar_calibrator camera_lidar_calibration.launch intrinsics_file:=/PATH/TO/YYYYmmdd_HHMM_autoware_camera_calibration.yaml image_src:=/image`
- An image viewer will be displayed.
- Open Rviz and show the point cloud and the correct fixed frame.
- Observe the image and the point cloud simultaneously.
- Find a point within the image that you can match to a corresponding point within the point cloud.
- Click on the pixel of the point in the image.
- Click on the corresponding 3D point in Rviz using the Publish Point tool.
- Repeat this with at least 9 different points.
- Once finished, a file named `YYYYmmdd_HHMM_autoware_lidar_camera_calibration.yaml` will be saved in your home directory.
This file can be used with Autoware's Calibration Publisher to publish and register the transformation between the LiDAR and camera. The file contains both the intrinsic and extrinsic parameters.
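A quick way to reason about the result is to reproject LiDAR points into the image using the extrinsic and intrinsic parameters. A minimal sketch (all matrices below are made-up placeholders, not output from the tool):

```python
import numpy as np

# Placeholder calibration values (illustrative only).
K = np.array([[800.0,   0.0, 640.0],
              [  0.0, 800.0, 360.0],
              [  0.0,   0.0,   1.0]])   # camera matrix (intrinsics)
R = np.eye(3)                           # LiDAR -> camera rotation (extrinsics)
t = np.array([0.0, 0.0, 0.1])           # LiDAR -> camera translation, meters

def lidar_to_pixel(p_lidar):
    """Transform a LiDAR point into the camera frame, then project it."""
    p_cam = R @ np.asarray(p_lidar, dtype=float) + t
    uvw = K @ p_cam
    return uvw[:2] / uvw[2]

# With these placeholder values, a point on the optical axis lands
# exactly on the principal point (cx, cy).
u, v = lidar_to_pixel([0.0, 0.0, 1.9])
print(u, v)  # -> 640.0 360.0
```

If clicked point pairs reproject with a small pixel error, the calibration is consistent.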
| Parameter | Type | Description |
|---|---|---|
| `image_src` | string | Topic name of the camera image source topic. Default: `/image_raw`. |
| `camera_id` | string | If working with more than one camera, set this to the correct camera namespace, i.e. `/camera0`. |
| `intrinsics_file` | string | Path to the intrinsic calibration file obtained with the camera calibration tool described above. |
| `compressed_stream` | bool | If set to `true`, a node to convert the image from a compressed stream to an uncompressed one will be launched. |
To test the calibration results, the generated YAML file can be used in the Calibration Publisher and then the Points Image in the Sensing tab.
This calibration tool assumes that the Velodyne is mounted with the default Velodyne axis convention:
- X axis points to the front
- Y axis points to the left
- Z axis points upwards
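Under that convention, the Velodyne axes relate to a typical camera optical frame (x right, y down, z forward) by a fixed rotation. The mapping below is a common assumption for illustration, not something the tool outputs:

```python
import numpy as np

# Velodyne frame: x forward, y left, z up.
# Camera optical frame: x right, y down, z forward.
# Row i of R_vc selects the Velodyne axis that maps to camera axis i:
#   x_cam = -y_vel,  y_cam = -z_vel,  z_cam = x_vel
R_vc = np.array([[0.0, -1.0,  0.0],
                 [0.0,  0.0, -1.0],
                 [1.0,  0.0,  0.0]])

p_vel = np.array([5.0, 1.0, 0.5])  # 5 m ahead, 1 m left, 0.5 m up
p_cam = R_vc @ p_vel
print(p_cam)  # ahead -> +z, left -> -x, up -> -y
```

The extrinsic matrix produced by the calibration combines a rotation of this kind with the measured mounting offset between the two sensors.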

