2013 | OriginalPaper | Chapter

Visual Gyroscope for Omnidirectional Cameras

Authors: Nicola Carlon, Emanuele Menegatti

Published in: Intelligent Autonomous Systems 12

Publisher: Springer Berlin Heidelberg

At present, algorithms for attitude estimation with omnidirectional cameras are predominantly environment-dependent, which significantly limits their applicability. This study introduces an approach to general attitude estimation for mobile cameras: it extracts features to estimate the three-dimensional motion of a humanoid robot directly from its head-mounted camera, without resorting to Structure from Motion with epipolar geometry, which cannot yet be computed in real time. The central idea is that the motion between consecutive frames can be reliably estimated by exploiting the correspondence, on the unit sphere, between parallel lines in the world and their projections as great circles. After calibration, optical flow tracks are matched to these projected lines, and their common point at infinity corresponds to the focus of expansion of the motion. Simulations and experiments validate the ability to distinguish between translation, pure rotation, and roto-translation.
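
The sketch below is only an illustrative reading of that central idea, not the authors' implementation. It assumes that tracked image points from two consecutive frames have already been lifted onto the unit sphere using the omnidirectional camera calibration (that lifting step is omitted), and the function name focus_of_expansion is hypothetical. Each optical flow track spans a great circle whose normal is the cross product of its two rays; under translation these circles meet at the focus of expansion, recovered here as the smallest singular vector of the stacked normals.

```python
import numpy as np

def focus_of_expansion(rays_prev, rays_curr):
    """Estimate a common vanishing direction from tracked unit-sphere rays.

    rays_prev, rays_curr: (N, 3) arrays of unit vectors obtained by lifting
    tracked image points of two consecutive frames onto the unit sphere
    (requires the omnidirectional calibration, assumed done elsewhere).
    """
    # Great-circle normal of each track; discard tracks that barely moved.
    normals = np.cross(rays_prev, rays_curr)
    lengths = np.linalg.norm(normals, axis=1)
    normals = normals[lengths > 1e-9] / lengths[lengths > 1e-9, None]

    # The direction most orthogonal to every normal is the smallest
    # right singular vector: the candidate focus of expansion.
    _, s, vt = np.linalg.svd(normals)
    foe = vt[-1] / np.linalg.norm(vt[-1])

    # Small residual -> the great circles are consistent with a single
    # vanishing direction (translation-like motion).
    residual = s[-1] / max(s[0], 1e-12)
    return foe, residual


if __name__ == "__main__":
    # Toy check: rays before/after a small camera translation along +z.
    rng = np.random.default_rng(0)
    pts = rng.normal(size=(50, 3)) + np.array([0.0, 0.0, 5.0])
    prev = pts / np.linalg.norm(pts, axis=1, keepdims=True)
    moved = pts - np.array([0.0, 0.0, 0.2])   # points shift by -t in camera frame
    curr = moved / np.linalg.norm(moved, axis=1, keepdims=True)
    foe, res = focus_of_expansion(prev, curr)
    print("estimated FOE:", foe, "residual:", res)   # FOE close to +/- [0, 0, 1]
```

In this reading, a near-zero residual indicates a consistent focus of expansion (translation or roto-translation along that direction), whereas a large residual suggests the flow field is dominated by rotation; the paper's actual classification between translation, pure rotation, and roto-translation may differ.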

Metadata
Title
Visual Gyroscope for Omnidirectional Cameras
Authors
Nicola Carlon
Emanuele Menegatti
Copyright Year
2013
Publisher
Springer Berlin Heidelberg
DOI
https://doi.org/10.1007/978-3-642-33926-4_31
