Abstract: Extrinsic calibration is a key prerequisite for fusing LiDAR and camera data. However, current calibration methods still face several challenges, such as dependence on prior conditions, reliance on a single type of feature constraint, and low calibration accuracy. To address these issues, this paper proposes a two-stage automatic extrinsic calibration method that combines the advantages of mutual information and multi-feature constraints. In the first stage, a coarse extrinsic calibration is performed by constructing a mutual information maximization model: the initial extrinsic parameters are obtained from the correlation between LiDAR reflectance and camera grayscale values, without depending on initial values, preset values, or any other prior conditions. An adaptive gradient algorithm is also designed to refine these initial extrinsic parameters. In the second stage, the extrinsic parameters are finely calibrated under multiple feature constraints, including point-to-line, point-to-plane, and line-to-plane constraints, which optimize the initial parameters obtained from the first stage. The iterative closest point (ICP) algorithm is then used to minimize the reprojection error between the 3D geometric features of the point cloud and the 2D geometric features of the image. Finally, extrinsic calibration experiments were conducted in both challenging indoor and outdoor environments using a specially designed hollow circular calibration board that simultaneously provides point, line, and plane feature constraints. The experimental results show that the proposed method can automatically and precisely estimate the extrinsic parameters between LiDAR and camera without requiring initial values, and that it exhibits higher accuracy and stability.
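To make the coarse-stage objective concrete, the following is a minimal sketch (not the paper's implementation) of the mutual information computed between projected LiDAR reflectance values and the camera grayscale values they land on; the function name, histogram bin count, and the assumption of already-paired samples are illustrative choices only. A calibration search would maximize this quantity over candidate extrinsic parameters.

```python
import numpy as np

def mutual_information(reflectance, grayscale, bins=32):
    """Mutual information between paired LiDAR reflectance and camera
    grayscale samples (one pair per point projected into the image).

    Hypothetical helper for the coarse stage: higher MI indicates the
    candidate extrinsics align reflectance with image intensity better.
    """
    # Joint histogram of the two intensity channels, normalized to a
    # joint probability table p(x, y).
    joint, _, _ = np.histogram2d(reflectance, grayscale, bins=bins)
    p_xy = joint / joint.sum()
    # Marginals p(x) and p(y) by summing out the other variable.
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    # MI = sum over nonzero cells of p(x,y) * log(p(x,y) / (p(x) p(y))).
    nz = p_xy > 0
    return np.sum(p_xy[nz] * np.log(p_xy[nz] / (p_x @ p_y)[nz]))
```

With well-aligned extrinsics the reflectance and grayscale samples are statistically dependent and the MI is large; with misaligned extrinsics the pairing degrades toward independence and the MI drops toward zero, which is what makes it usable as a prior-free objective.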