DOI: 10.1145/2390256.2390282
Research article

Multimodal interaction in the car: combining speech and gestures on the steering wheel

Published: 17 October 2012

ABSTRACT

Implementing controls in the car has become a major challenge: the use of simple physical buttons does not scale to the growing number of assistive, comfort, and infotainment functions. Current solutions include hierarchical menus and multi-functional control devices, which increase complexity and visual demand. Another option is speech control, which is not widely accepted, as it does not support visibility of actions, fine-grained feedback, or easy undo of actions. Our approach combines speech and gestures. By using speech to identify functions, we exploit the visibility of objects in the car (e.g., the mirror) and provide simple access to a wide range of functions, equivalent to a very broad menu. By using gestures for manipulation (e.g., left/right), we provide fine-grained control with immediate feedback and easy undo of actions. In a user-centered process, we determined a set of user-defined gestures as well as common voice commands. For a prototype, we linked these to a car interior and a driving simulator. In a study with 16 participants, we explored the impact of this form of multimodal interaction on driving performance against a baseline using physical buttons. The results indicate that using speech and gestures is slower than using buttons but yields similar driving performance. In a DALI questionnaire, participants reported lower visual demand when using speech and gestures.
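The division of labor the abstract describes, speech to select a function by name and steering-wheel gestures for fine-grained adjustment with easy undo, can be sketched as a small dispatcher. This is a minimal illustrative sketch, not the paper's prototype; all names, commands, and value ranges are hypothetical.

```python
# Illustrative sketch (not the paper's implementation) of the interaction
# pattern from the abstract: speech identifies the function, steering-wheel
# gestures manipulate it. All names, ranges, and commands are hypothetical.

class MultimodalController:
    """Speech selects a function; gestures adjust it with immediate feedback."""

    def __init__(self):
        # Hypothetical function table: each spoken name maps to an
        # adjustable value with a step size and a valid range.
        self.functions = {
            "mirror": {"value": 0, "step": 1, "min": -10, "max": 10},
            "volume": {"value": 5, "step": 1, "min": 0, "max": 20},
        }
        self.active = None  # function chosen by the last speech command

    def on_speech(self, command):
        """Speech gives direct access to any function by name
        (the 'very broad menu' of the abstract)."""
        if command in self.functions:
            self.active = command
            return True
        return False

    def on_gesture(self, direction):
        """A left/right gesture nudges the selected function by one step,
        clamped to its range. Repeating the opposite gesture is the
        'easy undo'. Returns the new value as immediate feedback."""
        if self.active is None:
            return None
        f = self.functions[self.active]
        delta = f["step"] if direction == "right" else -f["step"]
        f["value"] = max(f["min"], min(f["max"], f["value"] + delta))
        return f["value"]


ctrl = MultimodalController()
ctrl.on_speech("mirror")   # say "mirror" to select the mirror function
ctrl.on_gesture("right")   # tilt gesture: mirror value 0 -> 1
ctrl.on_gesture("left")    # opposite gesture undoes it: back to 0
```

The point of the split is that each modality does what it is good at: speech names one of arbitrarily many targets without a menu hierarchy, while a gesture gives a continuous, immediately reversible adjustment.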


    • Published in

      cover image ACM Other conferences
      AutomotiveUI '12: Proceedings of the 4th International Conference on Automotive User Interfaces and Interactive Vehicular Applications
      October 2012
      280 pages
      ISBN:9781450317511
      DOI:10.1145/2390256

      Copyright © 2012 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

      Publisher

      Association for Computing Machinery

      New York, NY, United States


Acceptance Rates

Overall acceptance rate: 248 of 566 submissions, 44%
