At present, attitude-estimation algorithms for omnidirectional cameras are predominantly environment-dependent, which significantly limits their applicability. This study introduces an approach to general mobile camera attitude estimation. The approach extracts features to estimate the three-dimensional movements of a humanoid robot directly from its head-mounted camera, and is therefore not subject to the constraints of Structure from Motion with epipolar geometry, which cannot yet be computed in real time. The central idea is that movements between consecutive frames can be reliably estimated from a property of the unit sphere: parallel lines in the environment project onto great circles. After calibration, these projected lines are matched to optical flow tracks, and the point at infinity of the parallel lines corresponds to the focus of expansion of the motion. Simulations and experiments validate the ability to distinguish between translation, pure rotation, and roto-translation.
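The spherical-geometry idea in the abstract can be made concrete: points of a 3-D line, lifted to the unit sphere of a calibrated omnidirectional camera, lie on a great circle (the intersection of the sphere with the plane through the line and the camera center); two parallel lines yield two such planes whose intersection is the lines' common direction, i.e. their point at infinity. A minimal numpy sketch under these assumptions, not the authors' implementation (function names are illustrative):

```python
import numpy as np

def fit_great_circle_normal(X):
    """Least-squares normal n of the plane n . x = 0 through unit-sphere
    points X (N, 3): the right singular vector of X with smallest
    singular value minimizes sum_i (n . x_i)^2 subject to |n| = 1."""
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return Vt[-1]

def vanishing_direction(n1, n2):
    """Two parallel 3-D lines project to great circles whose planes both
    contain the lines' common direction; the planes' intersection
    (cross product of their normals) is that point at infinity."""
    d = np.cross(n1, n2)
    return d / np.linalg.norm(d)

def rotation_from_directions(D0, D1):
    """Kabsch alignment: rotation R with R @ D0[i] ~ D1[i] for matched
    unit direction sets (N, 3), e.g. vanishing directions tracked
    between consecutive frames."""
    H = D0.T @ D1
    U, _, Vt = np.linalg.svd(H)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    return Vt.T @ S @ U.T
```

Comparing the vanishing directions recovered in two consecutive frames gives the inter-frame rotation; a pure translation instead leaves the vanishing directions fixed and moves image points along the flow field radiating from the focus of expansion, which is how the two motion components can be separated.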
- Visual Gyroscope for Omnidirectional Cameras
- Springer Berlin Heidelberg