An anchor-guided 3D target detection algorithm based on stereo RCNN

CLC Number: TP391.41; TH741

Abstract:

Current binocular (stereo) 3D detection algorithms suffer from slow online inference because a large number of candidate anchors must be evaluated. To address this issue, an anchor-guided 3D target detection algorithm based on Stereo RCNN, named FGAS RCNN, is proposed. In the first stage, probability maps are generated for the left and right input images to produce sparse anchor points and the corresponding sparse anchor boxes; the left and right anchors are then treated as a single unit to generate 2D proposal boxes. The second stage uses a keypoint generation network built on a feature pyramid network: keypoint heatmaps are generated from the information carried by the sparse anchor points, and a 3D bounding box is obtained by combining a stereo regressor with these heatmaps. Because the original image loses pixel-level information after convolution, the instance segmentation mask produced by the mask branch is used to compensate, and the depth of the 3D bounding-box center is refined with this mask together with instance-level disparity estimation. Experimental results show that the proposed method reduces computation while maintaining a high recall rate, without any depth or position prior as input. Specifically, the mean average precision for 3D target detection at a threshold of 0.7 is 44.07%, an improvement of 4.5% over Stereo RCNN, while the overall running time is 0.09 s shorter than that of Stereo RCNN.
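The abstract does not give implementation details for the depth-refinement step, but it relies on the standard stereo relation between disparity and depth, z = f·b/d, applied at the instance level. The sketch below illustrates one way an instance-level disparity could be aggregated from a mask-branch segmentation and converted into a box-center depth; the function name, the median aggregation, and the camera parameters are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def instance_center_depth(disparity_map, instance_mask, focal_px, baseline_m):
    """Estimate an instance-level depth for a 3D box center from stereo disparity.

    disparity_map : (H, W) float array of per-pixel disparities (pixels)
    instance_mask : (H, W) bool array from the mask branch (True = object pixel)
    focal_px      : camera focal length in pixels
    baseline_m    : stereo baseline in meters

    Illustrative sketch of the stereo relation z = f * b / d applied to pixels
    selected by the instance mask; the paper's actual disparity estimation and
    aggregation may differ.
    """
    d = disparity_map[instance_mask]
    d = d[d > 0]                      # keep valid (positive) disparities only
    if d.size == 0:
        return None                   # no usable disparity inside the mask
    d_inst = np.median(d)             # robust instance-level disparity
    return focal_px * baseline_m / d_inst


# Example usage with synthetic data (hypothetical values)
if __name__ == "__main__":
    H, W = 4, 6
    disp = np.full((H, W), 20.0)      # 20-pixel disparity everywhere
    mask = np.zeros((H, W), dtype=bool)
    mask[1:3, 2:5] = True             # fake instance region
    z = instance_center_depth(disp, mask, focal_px=721.5, baseline_m=0.54)
    print(f"estimated center depth: {z:.2f} m")  # ~19.48 m with KITTI-like f, b
```

Masking the disparity map before aggregating avoids mixing in background pixels, which is the intuition behind using the instance segmentation mask to improve the precision of the box-center depth.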

History
  • Online: June 28, 2023