VI-SLAM algorithm with camera-IMU extrinsic automatic calibration and online estimation
CLC Number: TP242.6; TH76

    Abstract:

    The visualinertial simultaneous location and mapping (VISLAM) is mainly based on visual and inertial navigation information fusion. It is a tedious work to calibrate the cameraIMU extrinsic parameter offline. The tracking accuracy is affected when the mechanical configuration of the sensor suite changes slightly due to the impact or equipment adjustment. To solve this problem, one kind of VISLAM algorithm with automatic calibration and online estimation of the cameraIMU extrinsic parameters is proposed. In the algorithm, the first step is to estimate the cameraIMU extrinsic rotation with the handeye calibration and the gyroscope bias. Secondly, the scale factor, gravity and cameraIMU extrinsic translation are estimated without considering the accelerometer bias. Thirdly, these parameters are updated with the gravitational magnitude and accelerometer bias. Finally, the cameraIMU extrinsic parameters are put into the state vectors for online estimation. Experimental results using the EuRoC datasets show that the algorithm can automatically calibrate and estimate the cameraIMU extrinsic parameters. The errors of extrinsic orientation and the translation are within 05 degree and 002 meter, respectively. This can help improve the rapid utilization and accuracy of the VISLAM system.

History
  • Online: February 10, 2022