ABSTRACT
Head-Up Displays (HUDs) have the advantage of being visible in the driver's line of sight, minimizing visual distraction. However, they are difficult to manipulate directly, since their content is virtually positioned behind the windshield. We used a Leap Motion controller to realize a gesture-controlled HUD. We conducted a simulator study with two variants of a simplified HUD: one with three segments and one with four. We show that the 3-segment HUD outperforms the 4-segment HUD in terms of interaction time and error rate. We also provide data on the horizontal angle at which participants pointed with the index finger of the right hand when selecting one of the three or four HUD segments. Our results can inform HUD interaction designers in interpreting mid-air pointing gestures to achieve higher selection success rates.
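As an illustration of how such angle-based segment selection might be implemented, the following is a minimal Python sketch that maps the horizontal (yaw) angle of an index-finger pointing ray to one of three or four HUD segments. The `Vector3` helper, the coordinate convention, and the angular boundaries are assumptions for illustration only; the study's measured angle data, which is not reproduced here, would supply the actual boundaries.

```python
import math
from dataclasses import dataclass


@dataclass
class Vector3:
    """3D direction vector, e.g. the index-finger direction from a hand tracker."""
    x: float  # lateral (positive = right)
    y: float  # vertical (positive = up)
    z: float  # depth (negative = away from the driver, toward the windshield)


def horizontal_angle_deg(direction: Vector3) -> float:
    """Horizontal (yaw) angle of the pointing ray in degrees.

    0 degrees means pointing straight ahead; positive angles point right,
    negative angles point left. Assumes a right-handed frame in which -z
    faces the windshield (the Leap Motion frame is oriented this way when
    the sensor faces up toward the driver's hand).
    """
    return math.degrees(math.atan2(direction.x, -direction.z))


def select_segment(direction: Vector3, boundaries_deg: list[float]) -> int:
    """Map a pointing direction to a HUD segment index (0..len(boundaries_deg)).

    `boundaries_deg` holds the angular borders between adjacent segments,
    sorted left to right; n borders partition the HUD into n + 1 segments.
    """
    angle = horizontal_angle_deg(direction)
    for i, border in enumerate(boundaries_deg):
        if angle < border:
            return i
    return len(boundaries_deg)


# Illustrative thresholds only -- not the angles measured in the study.
THREE_SEGMENT_BORDERS = [-8.0, 8.0]        # segments: left | center | right
FOUR_SEGMENT_BORDERS = [-12.0, 0.0, 12.0]  # four narrower segments

if __name__ == "__main__":
    finger = Vector3(x=0.12, y=0.05, z=-0.99)  # slightly right of straight ahead
    print(select_segment(finger, THREE_SEGMENT_BORDERS))  # -> 1 (center)
    print(select_segment(finger, FOUR_SEGMENT_BORDERS))   # -> 2
```

One design implication of the study's result is visible here: fewer segments mean wider angular targets, so pointing jitter is less likely to cross a boundary, which is consistent with the lower error rate observed for the 3-segment HUD.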