Published in: 3D Research 2/2019

01.06.2019 | 3DR Express

Nonparametric Statistical and Clustering Based RGB-D Dense Visual Odometry in a Dynamic Environment

By: Wugen Zhou, Xiaodong Peng, Haijiao Wang, Bo Liu



Abstract

Robust dense visual odometry remains a challenging problem when moving objects appear in the scene. In this paper, we propose a dense visual odometry method that handles highly dynamic environments using RGB-D data. First, to detect dynamic objects, we propose a multi-frame residual computation model that also takes a frame with a large temporal offset into account, yielding temporally consistent motion segmentation. The proposed method then combines a scene clustering model with a nonparametric statistical model to obtain weighted cluster-wise residuals, where each weight reflects how strongly that cluster's residual is taken into account. The motion-segmentation labels and cluster weights are then incorporated into the energy-function optimization of the dense visual odometry to reduce the influence of moving objects. Finally, experimental results demonstrate that the proposed method outperforms state-of-the-art methods on many challenging sequences from a benchmark dataset, especially on highly dynamic sequences.
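The core idea of weighting cluster-wise residuals so that likely-dynamic clusters contribute less to the energy function can be illustrated with a minimal sketch. This is not the paper's actual model: the function name, the use of the median absolute deviation as the nonparametric scale statistic, and the Cauchy-style weight are illustrative assumptions standing in for the paper's scene clustering and nonparametric statistical models.

```python
import numpy as np

def cluster_weighted_energy(residuals, labels, n_clusters):
    """Down-weight clusters whose residuals look dynamic (hypothetical sketch).

    residuals: per-pixel photometric/geometric residuals (1-D array)
    labels:    cluster index per pixel, in [0, n_clusters)
    Returns the weighted sum-of-squares energy and per-cluster weights.
    """
    # Nonparametric scale estimate of the overall residual distribution
    # (median absolute deviation); a stand-in for the paper's statistics.
    mad = np.median(np.abs(residuals - np.median(residuals))) + 1e-12

    weights = np.empty(n_clusters)
    for c in range(n_clusters):
        r_c = residuals[labels == c]
        # Clusters whose typical residual is large relative to the
        # background scale are likely moving; shrink their weight.
        score = np.median(np.abs(r_c)) / mad
        weights[c] = 1.0 / (1.0 + score**2)

    energy = sum(weights[c] * np.sum(residuals[labels == c] ** 2)
                 for c in range(n_clusters))
    return energy, weights
```

With residuals split into a low-residual (static) and a high-residual (dynamic) cluster, the dynamic cluster receives a much smaller weight, so it contributes less to the pose-estimation energy.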


Metadata
Title
Nonparametric Statistical and Clustering Based RGB-D Dense Visual Odometry in a Dynamic Environment
Authors
Wugen Zhou
Xiaodong Peng
Haijiao Wang
Bo Liu
Publication date
01.06.2019
Publisher
3D Display Research Center
Published in
3D Research / Issue 2/2019
Electronic ISSN: 2092-6731
DOI
https://doi.org/10.1007/s13319-019-0220-4
