Tightly-coupled SLAM method for unmanned ground vehicles based on factor graph optimization with multi-sensor information constraints
Affiliation:

1.School of Automation, Harbin University of Science and Technology, Harbin 150080, China; 2.Shandong Bim Information Technology Co., Ltd., Weihai 264200, China

CLC Number: TH85; TH242
    Abstract:

Multi-sensor simultaneous localization and mapping (SLAM) mitigates single-sensor limitations, yet current methods still face challenges such as monocular scale ambiguity, inaccurate inertial measurement unit (IMU) initialization, and limited local mapping precision. This paper proposes a tightly-coupled, factor graph-based SLAM approach that fuses data from three heterogeneous sensors: a 3D light detection and ranging (LiDAR) sensor, an IMU, and a camera. For initialization, LiDAR data provides depth for visual features, and outliers are removed through neighborhood selection and statistical optimization to improve accuracy. Visual, LiDAR, and IMU data are then fused to jointly estimate IMU biases and the gravity direction, reducing vertical map drift. For local optimization, factor graphs dynamically maintain keyframes and local maps within sliding windows. Visual constraints are refined through co-visibility projection matching, efficiently purging redundant map points while improving accuracy and robustness. Global optimization incorporates loop-closure factors detected via specialized algorithms and applies incremental optimization to the factor graph, suppressing cumulative error without compromising real-time performance. The proposed method is evaluated on the KITTI dataset, the M2UD extreme-weather dataset, and a real-campus dataset. Compared to LIO-SAM, it reduces the absolute trajectory error by 53.1% on KITTI, 66% in M2UD rain/snow scenarios, and 20.3% in campus environments. The resulting maps exhibit higher structural consistency and geometric accuracy in both overhead and side views.
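The depth-association step described in the abstract — selecting LiDAR points in a pixel neighborhood of each visual feature and statistically rejecting outliers — could be sketched as follows. This is a minimal illustration under assumed parameters, not the paper's implementation; the function name, the pixel `radius`, and the rejection threshold `std_k` are all hypothetical:

```python
import numpy as np

def depth_from_lidar_neighborhood(feature_uv, lidar_uvd, radius=5.0, std_k=1.0):
    """Estimate a depth for one visual feature from nearby projected LiDAR points.

    feature_uv : (2,) pixel coordinates of the visual feature
    lidar_uvd  : (N, 3) LiDAR points projected into the image as (u, v, depth)
    radius     : pixel radius for neighborhood selection (assumed value)
    std_k      : depths beyond std_k standard deviations from the
                 neighborhood mean are treated as outliers (assumed value)

    Returns the estimated depth, or None if no valid neighbors remain.
    """
    # Neighborhood selection: keep LiDAR points projecting near the feature.
    d2 = np.sum((lidar_uvd[:, :2] - feature_uv) ** 2, axis=1)
    neigh = lidar_uvd[d2 <= radius ** 2, 2]
    if neigh.size == 0:
        return None
    # Statistical rejection: discard depths far from the neighborhood mean,
    # e.g. background points leaking past a foreground depth discontinuity.
    mu, sigma = neigh.mean(), neigh.std()
    inliers = neigh[np.abs(neigh - mu) <= std_k * sigma + 1e-9]
    if inliers.size == 0:
        return None
    return float(inliers.mean())
```

In practice such a filter keeps the depth estimate on the object the feature actually lies on: a single background return inside the pixel window shifts the raw mean badly, but is rejected by the standard-deviation gate before averaging.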

History
  • Online: January 13, 2026