
2022 | Original Paper | Book Chapter

An Object Tracking Using a Neuromorphic System Based on Standard RGB Cameras

Authors: E. B. Gouveia, L. M. Vasconcelos, E. L. S. Gouveia, V. T. Costa, A. Nakagawa-Silva, A. B. Soares

Published in: XXVII Brazilian Congress on Biomedical Engineering

Publisher: Springer International Publishing


Abstract

Event-based cameras are devices that may hold the key to solving various robotics challenges. However, unlike the Computer Vision field, research in Neuromorphic Vision lacks the data needed to test, evaluate, and compare algorithms, which is essential for progress toward robust and competitive solutions. We therefore propose a framework that converts recordings from standard RGB cameras into neuromorphic information. Using this framework, we created neuromorphic recordings from videos in Computer Vision datasets and used them to test and evaluate our tracking algorithm. We obtained an average tracking accuracy of 95.38% over the five videos selected in this work.
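The abstract does not spell out the conversion algorithm, but a common way to emulate an event camera from standard video is to threshold log-intensity differences between consecutive frames, emitting a positive or negative event at each pixel whose brightness changed enough. The sketch below illustrates this general idea only; the function name, the threshold value, and the per-pixel reference update are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def frames_to_events(frames, timestamps, threshold=0.2):
    """Emulate event-camera output from grayscale frames.

    frames: list of 2D uint8 arrays (one per video frame).
    timestamps: list of frame times (same length as frames).
    Returns a list of events (x, y, t, polarity), emitted whenever the
    log-intensity change at a pixel exceeds `threshold`.
    This is a generic frame-to-event emulation, not the paper's exact method.
    """
    events = []
    eps = 1e-6  # avoid log(0)
    prev_log = np.log(frames[0].astype(np.float64) + eps)
    for frame, t in zip(frames[1:], timestamps[1:]):
        cur_log = np.log(frame.astype(np.float64) + eps)
        diff = cur_log - prev_log
        fired = np.abs(diff) >= threshold
        ys, xs = np.nonzero(fired)
        for x, y in zip(xs, ys):
            polarity = 1 if diff[y, x] > 0 else -1
            events.append((int(x), int(y), float(t), polarity))
        # Update the reference only where events fired, mimicking a DVS pixel
        prev_log[fired] = cur_log[fired]
    return events
```

Such an emulation yields events only where motion or illumination changes occur, which is what makes the converted Computer Vision videos usable as neuromorphic test recordings for a tracking algorithm.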


Metadata
Title
An Object Tracking Using a Neuromorphic System Based on Standard RGB Cameras
Authors
E. B. Gouveia
L. M. Vasconcelos
E. L. S. Gouveia
V. T. Costa
A. Nakagawa-Silva
A. B. Soares
Copyright Year
2022
DOI
https://doi.org/10.1007/978-3-030-70601-2_333
