Published in: Journal on Multimodal User Interfaces 4/2019

22.07.2019 | Original Paper

Time Well Spent with multimodal mobile interactions

Author: Nadia Elouali

Abstract

Mobile users lose a great deal of time on their smartphones. They interact even with busy hands (hands-free interactions), with distracted eyes (eyes-free interactions), and in many life situations (while walking, eating, working, etc.). Time Well Spent (TWS) is a movement that aims to design applications that respect users' choices and availability. In this paper, we discuss how multimodal mobile interactions and the TWS movement can support each other to protect users' time. We start by giving an overview of mobile multimodal interaction and highlighting the TWS concept. Then, we present our vision of applying mobile multimodality as a means to protect users' time. We show that multimodality can support TWS by encouraging self-restraint, while TWS supports multimodality by making interaction modalities meaningful. Finally, we discuss our future work in this context.


Metadata
Title
Time Well Spent with multimodal mobile interactions
Author
Nadia Elouali
Publication date
22.07.2019
Publisher
Springer International Publishing
Published in
Journal on Multimodal User Interfaces / Issue 4/2019
Print ISSN: 1783-7677
Electronic ISSN: 1783-8738
DOI
https://doi.org/10.1007/s12193-019-00310-1
