Abstract
The aim of this paper is to describe data fusion from optical sensors for mobile-robot reconnaissance and mapping. Data are acquired by a stereo pair of CCD cameras, a stereo pair of thermal imagers, and a TOF (time-of-flight) range camera.
The described calibration and data-fusion algorithms may be used for two purposes: visual telepresence (remote control) under an extremely wide variety of visual conditions, such as fog, smoke, and darkness, and multispectral autonomous digital mapping of the robot's environment.
The fusion is realized by means of spatial data from the TOF camera: the thermal and CCD camera data are combined either into one multispectral 3D model for mapping purposes or into a stereo image presented on a binocular head-mounted display. Data acquisition is performed by a sensor head containing the five cameras mentioned above, mounted on a 3-degrees-of-freedom (DOF) manipulator on the Orpheus-X3 reconnaissance robot; both the head and the robot were developed by our working group.
Although the fusion is used for two different tasks (automatic environment mapping and visual telepresence), the calibration and fusion algorithms employed are, in principle, the same.
Both the geometric calibration of each sensor and the mutual positions of the sensors in 6 DOF are calculated from calibration data acquired with a newly developed multispectral calibration pattern. For the fusion, the corresponding data from the CCD cameras and the thermal imagers are determined via homogeneous and perspective transformations. The result is either an image containing aligned data from the CCD camera and the thermal imager for each eye, or a set of 3D points annotated with color and thermal information. The precision of the data fusion, and consequently of the calibration, is determined both by calculation from a mathematical model and by experimental evaluation in real environments using newly developed multispectral targets.
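The core of the fusion step described above, mapping TOF range data into another sensor's image via a homogeneous (rigid) transformation followed by a perspective projection, can be sketched as follows. This is a minimal illustration assuming a standard pinhole camera model; the intrinsic matrix `K`, rotation `R`, and translation `t` below are placeholder values, not the paper's calibration results, which come from its multispectral calibration pattern.

```python
import numpy as np

# Hypothetical calibration parameters for one sensor (illustrative only).
K = np.array([[800.0,   0.0, 320.0],   # fx, skew, cx
              [  0.0, 800.0, 240.0],   # fy, cy
              [  0.0,   0.0,   1.0]])  # intrinsic matrix
R = np.eye(3)                          # rotation: TOF frame -> camera frame
t = np.array([0.05, 0.0, 0.0])         # translation in metres (assumed baseline)

def project_points(points_tof, K, R, t):
    """Map 3D TOF points into pixel coordinates of another camera.

    points_tof: (N, 3) array of 3D points in the TOF camera frame.
    Returns an (N, 2) array of pixel coordinates, computed as
    u ~ K (R X + t): a homogeneous rigid transform followed by
    a perspective divide.
    """
    cam = points_tof @ R.T + t          # rigid transform into camera frame
    uv = cam @ K.T                      # apply camera intrinsics
    return uv[:, :2] / uv[:, 2:3]       # perspective divide

pts = np.array([[0.0, 0.0, 2.0],        # point on the TOF optical axis, 2 m away
                [0.5, 0.0, 2.0]])
print(project_points(pts, K, R, t))     # pixel positions where color/thermal
                                        # values would be sampled for fusion
```

In the fused output, each 3D point would be annotated with the color and thermal values sampled at its projected pixel location in the respective images.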