School of Mechatronic Engineering and Automation, Shanghai Key Laboratory of Manufacturing Automation and Robotics, Shanghai University, Shanghai 200072, China
Binocular stereo vision and 3D laser scanning are common sensor measurement techniques for environment detection and modeling by mobile robots. To fuse the data of the two systems, the mathematical relationship between their local measurement coordinate systems must be established, i.e., the joint position-and-pose calibration of the two sensors. This paper proposes a new method for joint calibration based on distance matching among 3D feature points. A calibration board is designed, which is essentially a black-and-white chessboard with hollow areas. The binocular stereo cameras extract the 3D coordinates of the corner points of the chessboard, while a laser ranging radar scans the board and acquires the 3D coordinates of the centers of the hollow areas. By minimizing the squared differences between the theoretical and measured distances of the two groups of feature points, the rotation matrix and translation vector relating the two sensor coordinate systems are obtained. Joint measurement experiments demonstrate the accuracy and reliability of the proposed method.
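Once the two groups of feature points have been brought into correspondence, the remaining step of recovering the rotation matrix and translation vector is a least-squares rigid alignment problem. The sketch below, a hypothetical illustration not taken from the paper, shows the standard closed-form SVD (Kabsch) solution for that step; the function name and the assumption of known point correspondences are mine, and the paper's own distance-matching cost may be minimized differently.

```python
import numpy as np

def estimate_rigid_transform(P, Q):
    """Estimate R and t such that R @ P + t ~= Q in the least-squares sense.

    P: (3, N) corresponding feature points in the camera coordinate system.
    Q: (3, N) the same points in the laser scanner coordinate system.
    Closed-form SVD (Kabsch) solution; assumes correspondences are known.
    """
    # Centroids of both point sets
    p_mean = P.mean(axis=1, keepdims=True)
    q_mean = Q.mean(axis=1, keepdims=True)
    # Cross-covariance of the centered point sets
    H = (P - p_mean) @ (Q - q_mean).T
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force det(R) = +1 so R is a proper rotation
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = q_mean - R @ p_mean
    return R, t
```

Given noisy measurements, the same routine returns the best-fit transform in the Frobenius-norm sense, which is why it is a common building block in camera-to-LiDAR extrinsic calibration pipelines.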