Joint calibration of camera and lidar based on point cloud center
CLC Number: V441; TH744

Abstract:

To address the sparsity of lidar point clouds and the camera distortion caused by ambient light, an automatic registration method for lidar and camera based on point cloud centers is proposed. It avoids the manual selection of feature points and the continuous acquisition of multiple frames required by traditional joint calibration. After the laser point clouds and camera images are preprocessed, the point clouds of the multiple checkerboards are segmented automatically using the consistency of their plane normal vectors. The point cloud of each checkerboard is extracted in the laser coordinate system and the camera coordinate system, respectively, and the center points are then solved iteratively by point cloud aggregation. Coarse registration is achieved by matching the corresponding checkerboard center points of the two sensors. Finally, the iterative closest point (ICP) algorithm is applied for fine registration, and the calibration parameter matrix is obtained to complete the joint calibration. Measurement results show that the correct projection ratio of the point cloud reaches 97.93% within a lidar error range of ±3 cm. The method effectively obtains high-precision joint calibration parameters and meets the requirements of data fusion between lidar and camera in the space environment.
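
    The pipeline summarized above first recovers a coarse lidar-to-camera transform from the matched checkerboard center points and then refines it with ICP. As a rough illustration of the coarse step only (not the authors' implementation; the function name, the use of NumPy, and the sample coordinates are assumptions), corresponding centers can be aligned with the standard Kabsch/SVD rigid fit:

```python
import numpy as np

def rigid_transform_from_centers(lidar_centers, camera_centers):
    """Estimate R, t such that camera_point ≈ R @ lidar_point + t from matched 3D centers."""
    P = np.asarray(lidar_centers, dtype=float)   # (N, 3) checkerboard centers, lidar frame
    Q = np.asarray(camera_centers, dtype=float)  # (N, 3) matching centers, camera frame
    p_mean, q_mean = P.mean(axis=0), Q.mean(axis=0)
    H = (P - p_mean).T @ (Q - q_mean)            # 3x3 cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                     # guard against an improper rotation (reflection)
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = q_mean - R @ p_mean
    return R, t

# Illustrative usage with made-up coordinates for three checkerboard centers.
lidar_pts = np.array([[4.2, 1.0, 0.3], [4.5, -0.8, 0.2], [5.1, 0.1, 1.1]])
true_t = np.array([0.1, -0.05, 0.2])
camera_pts = lidar_pts + true_t                  # identity rotation, known translation
R, t = rigid_transform_from_centers(lidar_pts, camera_pts)
print(np.round(R, 3), np.round(t, 3))            # recovers R ≈ I, t ≈ (0.1, -0.05, 0.2)
```

    The resulting (R, t) would serve only as the initial alignment; the ICP stage described in the abstract then refines it into the final calibration parameter matrix.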

History
  • Online: April 19, 2022