22-06-2021 | Original Paper

A wearable virtual touch system for IVIS in cars

Authors: Gowdham Prabhakar, Priyam Rajkhowa, Dharmesh Harsha, Pradipta Biswas

Published in: Journal on Multimodal User Interfaces | Issue 1/2022

Abstract

In the automotive domain, performing secondary tasks such as operating the infotainment system, adjusting the air-conditioning vents, or adjusting the side mirrors distracts drivers from driving. Although existing modalities such as gesture and speech recognition facilitate these secondary tasks by reducing eyes-off-road time, they often require drivers to remember a set of gestures or screen sequences. In this paper, we propose two modalities that let drivers virtually touch the dashboard display using a laser tracker, one paired with a mechanical switch and one with an eye-gaze switch. We compared the performance of the proposed modalities against a conventional touchscreen in an automotive environment by measuring pointing and selection times for a representative secondary task, and we analysed the effect on driving performance in terms of deviation from lane, average speed, variation in perceived workload, and system usability. We did not find a significant difference in driving and pointing performance between the laser tracking system and the existing touchscreen. Our results also showed that the driving and pointing performance of the virtual touch system with the eye-gaze switch was significantly better than with the mechanical switch. We evaluated the efficacy of the proposed virtual touch system with the eye-gaze switch inside a real car and investigated its acceptance by professional drivers using qualitative research. The quantitative and qualitative studies indicated the importance of a multimodal system inside the car and highlighted several criteria for acceptance of a new automotive user interface.
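
As an illustration of the evaluation described above, the sketch below shows how per-modality pointing-time and lane-deviation logs could be summarised and compared. The trial values, variable names, and the choice of a one-way ANOVA are assumptions for illustration only and are not taken from the paper; the authors' actual analysis may differ.

```python
# Minimal sketch (illustrative, not from the paper): summarise pointing time
# and lane deviation per interaction modality, then test for a difference.
import numpy as np
from scipy import stats

# Hypothetical per-trial pointing/selection times (seconds) per modality.
pointing_times = {
    "touchscreen":       [1.42, 1.35, 1.51, 1.48, 1.39],
    "laser_mech_switch": [1.60, 1.72, 1.55, 1.66, 1.69],
    "laser_gaze_switch": [1.38, 1.31, 1.45, 1.36, 1.40],
}

# Hypothetical lateral position samples (metres from lane centre) per modality.
lane_position = {
    "touchscreen":       [0.21, -0.15, 0.18, -0.22, 0.10],
    "laser_mech_switch": [0.35, -0.28, 0.31, -0.26, 0.22],
    "laser_gaze_switch": [0.19, -0.12, 0.17, -0.20, 0.11],
}

for name, times in pointing_times.items():
    # Standard deviation of lateral position is a common proxy for
    # deviation from lane; mean pointing time summarises task speed.
    sdlp = np.std(lane_position[name], ddof=1)
    print(f"{name}: mean pointing time {np.mean(times):.2f} s, SDLP {sdlp:.2f} m")

# One-way ANOVA on pointing times across the three modalities, analogous to
# the significance testing of modality differences reported in the abstract.
f_stat, p_val = stats.f_oneway(*pointing_times.values())
print(f"ANOVA on pointing time: F = {f_stat:.2f}, p = {p_val:.3f}")
```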


Metadata

Title: A wearable virtual touch system for IVIS in cars
Authors: Gowdham Prabhakar, Priyam Rajkhowa, Dharmesh Harsha, Pradipta Biswas
Publication date: 22-06-2021
Publisher: Springer International Publishing
Published in: Journal on Multimodal User Interfaces, Issue 1/2022
Print ISSN: 1783-7677
Electronic ISSN: 1783-8738
DOI: https://doi.org/10.1007/s12193-021-00377-9
