2016 | Original Paper | Book Chapter
Nearest Position Estimation Using Omnidirectional Images and Global Appearance Descriptors
Authors: Yerai Berenguer, Luis Payá, Adrián Peidró, Arturo Gil, Oscar Reinoso
Published in: Robot 2015: Second Iberian Robotics Conference
This work presents an algorithm to estimate the position and orientation of a mobile robot using only the visual information provided by a catadioptric system mounted on the robot. Each omnidirectional scene is described with a single global appearance descriptor, computed with a description method we have developed based on the Radon transform. Our localization method compares the visual information captured by the robot from an unknown position with the visual information stored in a previously built map, and from this comparison it estimates the nearest map position and the orientation of the robot. We have tested all the algorithms with a virtual database composed of a set of omnidirectional images captured from different points of an indoor virtual environment. The experiments have allowed us to tune the main parameters, and the results show the effectiveness and robustness of our method.
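The pipeline the abstract describes (project each image onto a compact global descriptor, then find the nearest map entry by descriptor distance) can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: it uses a coarse Radon-style descriptor built from only four projection angles (0°, 45°, 90°, 135°), whereas the paper applies the full Radon transform to omnidirectional images; the function names and the Euclidean distance measure are assumptions.

```python
import numpy as np

def radon_descriptor(img):
    """Coarse Radon-style global appearance descriptor (illustrative only).
    Concatenates image projections at 0, 45, 90 and 135 degrees and
    normalizes the result to unit length."""
    h, w = img.shape
    p0 = img.sum(axis=0)                                  # rays at 0 deg (column sums)
    p90 = img.sum(axis=1)                                 # rays at 90 deg (row sums)
    p45 = np.array([np.trace(img, k) for k in range(-h + 1, w)])             # diagonals
    p135 = np.array([np.trace(np.fliplr(img), k) for k in range(-h + 1, w)])  # anti-diagonals
    d = np.concatenate([p0, p90, p45, p135]).astype(float)
    return d / np.linalg.norm(d)

def nearest_position(query, map_descriptors):
    """Return the index of the map descriptor closest to the query
    (Euclidean distance between normalized descriptors)."""
    dists = [np.linalg.norm(query - m) for m in map_descriptors]
    return int(np.argmin(dists))
```

In use, the map would be built offline by storing `radon_descriptor(img)` for each image captured at a known position; at runtime, the descriptor of the current view is matched against that list and the index of the nearest stored position is returned.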