ABSTRACT
Implementing controls in the car has become a major challenge: simple physical buttons do not scale to the growing number of assistive, comfort, and infotainment functions. Current solutions, such as hierarchical menus and multi-functional control devices, increase complexity and visual demand. Another option, speech control, is not widely accepted, as it does not support visibility of actions, fine-grained feedback, or easy undo of actions. Our approach combines speech and gestures. By using speech to identify functions, we exploit the visibility of objects in the car (e.g., the mirror) and provide simple access to a wide range of functions, equivalent to a very broad, flat menu. By using gestures for manipulation (e.g., left/right), we provide fine-grained control with immediate feedback and easy undo of actions. In a user-centered process, we determined a set of user-defined gestures as well as common voice commands. For a prototype, we linked these to a car interior and a driving simulator. In a study with 16 participants, we explored the impact of this form of multimodal interaction on driving performance against a baseline using physical buttons. The results indicate that using speech and gestures is slower than using buttons but yields similar driving performance. Responses to a DALI questionnaire suggest that the perceived visual demand is lower when using speech and gestures.
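The interaction pattern described above (speech selects a function from a broad, flat "menu"; gestures apply fine-grained, immediately visible adjustments that can easily be undone) can be illustrated as a small state machine. This is a hypothetical sketch, not the paper's implementation; the function names, value units, and step sizes are invented for demonstration.

```python
class SpeechGestureController:
    """Illustrative sketch: speech identifies the function,
    gestures manipulate its value (not the paper's actual code)."""

    def __init__(self):
        # Hypothetical in-car functions with example starting values.
        self.functions = {
            "mirror": 0,        # horizontal mirror angle, degrees
            "temperature": 21,  # cabin temperature, degrees Celsius
            "volume": 5,        # audio volume level
        }
        self.active = None      # function last selected by speech
        self.history = []       # (function, old_value) pairs for undo

    def speak(self, command):
        """Speech selects a function -- a very broad, flat 'menu'."""
        if command in self.functions:
            self.active = command
            return f"{command} selected"
        return "unknown function"

    def gesture(self, direction):
        """Gestures give fine-grained adjustments with immediate effect."""
        if self.active is None:
            return "no function selected"
        step = 1 if direction == "right" else -1
        old = self.functions[self.active]
        self.history.append((self.active, old))
        self.functions[self.active] = old + step
        return f"{self.active}: {self.functions[self.active]}"

    def undo(self):
        """Easy undo: revert the most recent gesture adjustment."""
        if not self.history:
            return "nothing to undo"
        func, old = self.history.pop()
        self.functions[func] = old
        return f"{func} restored to {old}"


ctrl = SpeechGestureController()
print(ctrl.speak("mirror"))   # mirror selected
print(ctrl.gesture("right"))  # mirror: 1
print(ctrl.gesture("right"))  # mirror: 2
print(ctrl.undo())            # mirror restored to 1
```

The split of labor mirrors the abstract's rationale: the speech step replaces deep menu navigation with direct naming of a visible object, while the gesture step keeps the feedback loop tight and reversible.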
Index Terms
- Multimodal interaction in the car: combining speech and gestures on the steering wheel