2016 | OriginalPaper | Chapter
Nearest Position Estimation Using Omnidirectional Images and Global Appearance Descriptors
Authors: Yerai Berenguer, Luis Payá, Adrián Peidró, Arturo Gil, Oscar Reinoso
Published in: Robot 2015: Second Iberian Robotics Conference
Publisher: Springer International Publishing
This work presents an algorithm to estimate the position and orientation of a mobile robot using only the visual information provided by a catadioptric system mounted on the robot. Each omnidirectional scene is described with a single global-appearance descriptor, computed with a description method we have developed based on the Radon transform. Our localization method compares the visual information captured by the robot from an unknown position with the visual information stored in a previously built map; as a result, it estimates the map position nearest to the robot, along with the robot's relative orientation. We have tested all the algorithms with a virtual database we have built, composed of a set of omnidirectional images captured from different points of an indoor virtual environment. The experiments have allowed us to tune the main parameters, and the results show the effectiveness and robustness of our method.
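The matching step described above can be sketched in code. A useful property of Radon-transform descriptors of omnidirectional images is that a rotation of the robot appears as a circular shift of the descriptor along its angular axis, so circular cross-correlation can recover both the best-matching map position and the relative orientation. The sketch below is an illustrative simplification, not the authors' implementation: each descriptor is reduced to a hypothetical 1-D angular profile, and FFT-based circular correlation is used for the comparison. All function names are assumptions introduced here.

```python
import numpy as np

def best_orientation(query, ref):
    """Circular cross-correlation of two 1-D angular descriptors via FFT.

    Returns (shift, score): the circular shift (in descriptor bins) that best
    aligns `query` with `ref`, and the correlation value at that shift.
    A shift of k bins corresponds to a robot rotation of k * (360 / N) degrees
    when the descriptor has N angular bins.
    """
    corr = np.fft.ifft(np.fft.fft(query) * np.conj(np.fft.fft(ref))).real
    shift = int(np.argmax(corr))
    return shift, corr[shift]

def localize(query, map_descriptors):
    """Nearest-position estimation against a previously built map.

    Compares the query descriptor (captured from an unknown position) with
    every stored map descriptor and returns the index of the nearest map
    position together with the estimated orientation shift.
    """
    best_idx, best_shift, best_score = -1, 0, -np.inf
    for i, ref in enumerate(map_descriptors):
        shift, score = best_orientation(query, ref)
        if score > best_score:
            best_idx, best_shift, best_score = i, shift, score
    return best_idx, best_shift
```

Because the correlation is computed with FFTs, each comparison costs O(N log N) in the descriptor length rather than O(N^2), which matters when the map contains many stored positions.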