
Evidential Sensor Fusion of Long-Wavelength Infrared Stereo Vision and 3D-LIDAR for Rangefinding in Fire Environments

Authors: Joseph W. Starr, B. Y. Lattimer

Published in: Fire Technology, Issue 6/2017

Publication date: 21-07-2017

Abstract

A sensor fusion method was developed to combine long-wavelength infrared (LWIR) stereo vision and a spinning LIDAR for improved rangefinding in smoke-obscured environments. The method supports rangefinding in both clear and smoke conditions, relying on the LIDAR's high accuracy in clear conditions and the ability of LWIR cameras to perceive through smoke. Sensor data were combined using evidential (Dempster–Shafer) theory in a 3D multi-resolution voxel domain over occupied and free space states. A heuristic was developed to separate significantly attenuated and low-attenuation LIDAR returns based on return intensity and distance. Sensor models were developed to apply free space state information from high-attenuation LIDAR returns, and occupied and free space state information from low-attenuation LIDAR returns and from LWIR stereo vision points. The fusion method was evaluated in two fire environments: a room-hallway scenario spanning clear to dense-smoke conditions and a shipboard fire scenario. Room-hallway tests were evaluated by comparing performance against baseline rangefinding measurements. For the occupied state, the fusion method and the LIDAR typically agreed to within 5% to 10% in clear conditions, and the fusion method was more accurate than the LIDAR by typically 5% to 10% in smoke conditions, with the LIDAR providing no data in the densest smoke. For the free space state, the fusion method outperformed the LIDAR in smoke conditions by as much as 40% and was typically within 5% of the LIDAR in clear conditions.
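The fusion step described above rests on Dempster's rule of combination over the two-hypothesis frame {occupied, free}, applied per voxel. The following is a minimal sketch of that combination, not the paper's implementation: the Mass class, the combine function, and all numeric mass values are illustrative assumptions; in the paper the input masses come from the LIDAR and LWIR stereo sensor models.

# Minimal sketch (not the paper's implementation): Dempster's rule of
# combination over the frame {occupied, free}, applied to one voxel.
# Names and mass values are illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class Mass:
    """Basic belief assignment for one voxel; the remainder
    1 - occupied - free is mass on the full frame, i.e. 'unknown'."""
    occupied: float
    free: float

    @property
    def unknown(self) -> float:
        return 1.0 - self.occupied - self.free

def combine(m1: Mass, m2: Mass) -> Mass:
    """Dempster's rule for the two-hypothesis frame {occupied, free}."""
    # Conflict mass: one source supports occupied where the other supports free.
    k = m1.occupied * m2.free + m1.free * m2.occupied
    if k >= 1.0:
        raise ValueError("Total conflict; evidence cannot be combined.")
    norm = 1.0 - k
    occ = (m1.occupied * m2.occupied
           + m1.occupied * m2.unknown
           + m1.unknown * m2.occupied) / norm
    fre = (m1.free * m2.free
           + m1.free * m2.unknown
           + m1.unknown * m2.free) / norm
    return Mass(occupied=occ, free=fre)

# Hypothetical per-voxel evidence: a low-attenuation LIDAR return strongly
# supporting "occupied", fused with weaker support from an LWIR stereo point.
lidar = Mass(occupied=0.80, free=0.05)
lwir = Mass(occupied=0.45, free=0.10)
fused = combine(lidar, lwir)
print(f"occupied={fused.occupied:.3f}, free={fused.free:.3f}, unknown={fused.unknown:.3f}")

Combining evidence this way lets agreement between sensors reinforce a state, while residual mass on the full frame preserves uncertainty where neither sensor is informative, which is the behavior the abstract relies on when LIDAR returns disappear in dense smoke.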


Metadata
Title
Evidential Sensor Fusion of Long-Wavelength Infrared Stereo Vision and 3D-LIDAR for Rangefinding in Fire Environments
Authors
Joseph W. Starr
B. Y. Lattimer
Publication date
21-07-2017
Publisher
Springer US
Published in
Fire Technology / Issue 6/2017
Print ISSN: 0015-2684
Electronic ISSN: 1572-8099
DOI
https://doi.org/10.1007/s10694-017-0666-y
