
2020 | Original Paper | Book Chapter

The Method and Performance Analysis of Constructing Visual Point Cloud Map to Assist Inertial Positioning

Authors: Yayi Wang, Feng Zhu, Yanfen Shen, Xiaohong Zhang

Published in: China Satellite Navigation Conference (CSNC) 2020 Proceedings: Volume III

Publisher: Springer Singapore


Abstract

Given discontinuous GNSS signals and accumulating INS errors in complex urban environments, research on multi-source information fusion positioning aided by high-precision maps is essential. To meet the real-time, continuous, and reliable positioning requirements of vehicle navigation, this paper presents the principle and method of constructing a high-precision map with stereo vision, introduces the key technologies of map generation and data cleaning, and analyzes the performance of the map-aided visual-inertial fusion positioning method. First, a visual point cloud map is constructed from mobile surveying: the local coordinates of road marking points are obtained by forward intersection of common-view images, and the camera poses solved from GNSS/SINS are used to transform the road marking points from the local coordinate system into ECEF. Because the number of road marking points is large and their absolute accuracy is difficult to evaluate, this paper proposes quality indexes for evaluating point cloud map precision together with a data cleaning method. Deep-learning-based target recognition and semantic segmentation ensure that only long-term, static, stable road markings are retained in the map. Meanwhile, a visual positioning method accelerated by OctoMap and DBoW is proposed. Tests on the KITTI dataset show that the point cloud map is constructed with an average data cleaning rate of 35.05%, so many erroneous observations are excluded. In terms of positioning, visual/GNSS/INS integration yields an accuracy of about 2 cm and 0.06°. When integrated with INS, a positioning accuracy of 10 cm can be maintained with only one road landmark matched successfully during a long GNSS outage (20 min).
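The two mapping steps summarized above — forward intersection of a road marking point observed in two common-view images, followed by transforming the resulting local coordinates into ECEF with a GNSS/SINS-derived pose — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names are invented, the cameras use normalized image coordinates, and the forward intersection is shown as standard linear (DLT) two-view triangulation.

```python
import numpy as np

def triangulate_two_view(P1, P2, x1, x2):
    """Forward intersection of one point seen in two views.

    P1, P2 : 3x4 projection matrices of the common-view images.
    x1, x2 : 2-vectors, the point's (normalized) image coordinates.
    Returns the 3D point in the local frame via linear (DLT) triangulation.
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The homogeneous solution is the right singular vector with the
    # smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X_h = Vt[-1]
    return X_h[:3] / X_h[3]

def local_to_ecef(points_local, R_el, t_e):
    """Transform N x 3 road-marking points from the local (mapping) frame
    into ECEF, given a GNSS/SINS-derived rotation R_el and origin t_e."""
    return points_local @ R_el.T + t_e
```

In a real pipeline the projection matrices come from calibrated stereo geometry and the GNSS/SINS trajectory, and the triangulated points would then pass through the paper's quality checks and data cleaning before entering the map.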


Metadata
DOI: https://doi.org/10.1007/978-981-15-3715-8_45
