2013 | OriginalPaper | Chapter
Visual Gyroscope for Omnidirectional Cameras
Authors : Nicola Carlon, Emanuele Menegatti
Published in: Intelligent Autonomous Systems 12
Publisher: Springer Berlin Heidelberg
At present, algorithms for attitude estimation with omnidirectional cameras are predominantly environment-dependent, which significantly limits their applicability. This study introduces an approach to general mobile camera attitude estimation. The approach extracts features to directly estimate the three-dimensional motion of a humanoid robot from its head-mounted camera, and is therefore not subject to the constraints of Structure from Motion with epipolar geometry, which is not currently attainable in real time. The central idea is that motion between consecutive frames can be reliably estimated from the correspondence, on the unit sphere, between parallel lines in the scene and their projections as great circles. After calibration, parallel lines are matched with optical flow tracks, and the point at infinity corresponds to the focus of expansion of the motion. Simulations and experiments validate the ability to distinguish between translation, pure rotation, and roto-translation.
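The geometric identity the abstract relies on can be illustrated with a short sketch (the function names and the least-squares fit are illustrative assumptions, not the authors' implementation): a 3-D line projects onto the unit sphere as a great circle, and two parallel lines yield great circles that intersect in the common vanishing direction, recoverable as the cross product of the two circle normals.

```python
import numpy as np

def fit_great_circle_normal(points):
    """Least-squares normal of the great circle through unit-sphere points.

    A 3-D line (not through the camera center) projects onto the unit
    sphere as a great circle; its plane normal is the smallest right
    singular vector of the stacked, normalized point coordinates.
    """
    P = np.asarray(points, dtype=float)
    P = P / np.linalg.norm(P, axis=1, keepdims=True)  # project onto sphere
    _, _, vt = np.linalg.svd(P)
    return vt[-1]  # direction minimizing sum of squared plane distances

def vanishing_direction(n1, n2):
    """Two parallel 3-D lines map to two great circles; the circles
    intersect in the shared vanishing direction, i.e. n1 x n2."""
    d = np.cross(n1, n2)
    return d / np.linalg.norm(d)

# Two parallel lines with direction (0, 0, 1), offset in x and y.
pts1 = [np.array([1.0, 0.0, t]) for t in np.linspace(-2, 2, 9)]
pts2 = [np.array([0.0, 1.0, t]) for t in np.linspace(-2, 2, 9)]

n1 = fit_great_circle_normal(pts1)
n2 = fit_great_circle_normal(pts2)
v = vanishing_direction(n1, n2)  # recovers +/-(0, 0, 1)
```

Under pure rotation this vanishing direction moves rigidly with the camera, while under translation the optical-flow field instead radiates from a focus of expansion, which is the distinction the method exploits.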