Visual SLAM algorithm based on graph neural network feature point matching
Affiliation: Beijing Electro-mechanical Engineering Institute, Beijing 100074, China

CLC Number: TP399; TH701

    Abstract:

    Vision-based simultaneous localization and mapping (SLAM) has significant applications in industries such as augmented reality and autonomous driving. However, traditional visual SLAM suffers from low positioning accuracy, or fails outright, in low-light conditions. This article proposes a visual SLAM algorithm, termed VINS-GNN, that uses a graph neural network (GNN) to match feature points between consecutive frames. In the front end of the visual SLAM system, a feature point matching and tracking strategy is designed that integrates the GNN with visual SLAM and effectively enhances feature point tracking performance. In the back end, a loop closure algorithm based on multi-frame fusion is designed to further improve global positioning accuracy. Comparative experiments on public datasets with low light and low texture show that VINS-GNN improves positioning accuracy by 17.33% over VINS-Fusion. In real indoor low-light experiments, VINS-GNN significantly improves accuracy at the end of the trajectory compared to VINS-Fusion. Additionally, the article introduces neural network inference acceleration techniques to reduce resource consumption and improve real-time performance. Experimental results show that the strategies proposed in VINS-GNN significantly enhance positioning accuracy under indoor low-light conditions, which is of great significance for the development of indoor pedestrian and mobile robot positioning technology.
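The paper's GNN matcher is not reproduced here; as a rough illustration of the kind of learned descriptor matching the front end performs between consecutive frames, the sketch below uses a dual-softmax score with a mutual nearest-neighbor check, in the spirit of attention-based matchers such as SuperGlue. All function names, the temperature, and the threshold are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def match_features(desc_a, desc_b, temperature=0.1, threshold=0.5):
    """Match two sets of L2-normalized descriptors (one set per frame).

    A simplified stand-in for a GNN matcher's final assignment layer:
    dual-softmax turns the similarity matrix into match probabilities in
    both directions, and only mutually consistent, confident pairs survive.
    """
    sim = desc_a @ desc_b.T                     # cosine similarity, (Na, Nb)
    p_ab = softmax(sim / temperature, axis=1)   # frame A -> frame B
    p_ba = softmax(sim / temperature, axis=0)   # frame B -> frame A
    scores = p_ab * p_ba                        # joint match confidence
    matches = []
    for i in range(sim.shape[0]):
        j = int(np.argmax(scores[i]))
        # Mutual nearest-neighbor check rejects one-sided matches.
        if int(np.argmax(scores[:, j])) == i and scores[i, j] > threshold:
            matches.append((i, j))
    return matches
```

In a full matcher the descriptors would first be refined by message passing over a graph of keypoints before this assignment step; tracking then feeds the surviving pairs to the SLAM front end in place of classical optical-flow correspondences.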

History
  • Online: December 19, 2024