
2018 | Original Paper | Book Chapter

bRIGHT – Workstations of the Future and Leveraging Contextual Models

Authors: Rukman Senanayake, Grit Denker, Patrick Lincoln

Published in: Human Interface and the Management of Information. Interaction, Visualization, and Analytics

Publisher: Springer International Publishing


Abstract

Experimenting with futuristic computer workstation designs and specifically tailored application models can yield useful insights and exciting ways to increase efficiency, effectiveness, and satisfaction for computer users. Designing and building a computer workstation that can track a user's gaze, sense proximity to the touch surface, and support multi-touch and face recognition meant overcoming some unique technological challenges. Coupled with extensions to commonly used applications that report user interactions in a meaningful way, the workstation allows the development of a rich contextual user model that is accurate enough to enable benefits such as contextual filtering, task automation, contextual auto-fill, and improved understanding of team collaborations. SRI's bRIGHT workstation was designed and built to explore these research avenues, to investigate how such a context model can be built, and to identify the key implications for designing an application model that best serves these goals. This paper also outlines future research toward a collaborative context model that could deliver similar benefits for groups of users.
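The core loop the abstract describes — applications report interactions, a contextual user model accumulates them, and features like contextual auto-fill query the model — can be sketched minimally. This is an illustration only, not the bRIGHT implementation; all class and field names here are hypothetical.

```python
from collections import Counter, defaultdict
from dataclasses import dataclass
from typing import Optional

# Hypothetical interaction event that an instrumented application might
# report to the workstation (field names are illustrative assumptions).
@dataclass(frozen=True)
class InteractionEvent:
    app: str     # reporting application, e.g. "mail"
    action: str  # e.g. "fill_field"
    target: str  # e.g. a form-field name
    value: str   # the value the user supplied

class ContextModel:
    """Toy contextual user model: counts observed (app, target) -> value
    pairs and suggests the most frequent value as a contextual auto-fill."""

    def __init__(self) -> None:
        self._counts: defaultdict = defaultdict(Counter)

    def observe(self, ev: InteractionEvent) -> None:
        # Each reported interaction updates the frequency statistics.
        self._counts[(ev.app, ev.target)][ev.value] += 1

    def autofill(self, app: str, target: str) -> Optional[str]:
        # Suggest the most frequently observed value for this context,
        # or None if this context has never been seen.
        counts = self._counts.get((app, target))
        if not counts:
            return None
        return counts.most_common(1)[0][0]
```

A real model would of course fuse richer signals (gaze, proximity, touch) and support task automation, but the sketch shows the reporting/query split the abstract relies on.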


Metadata
Title
bRIGHT – Workstations of the Future and Leveraging Contextual Models
Authors
Rukman Senanayake
Grit Denker
Patrick Lincoln
Copyright Year
2018
DOI
https://doi.org/10.1007/978-3-319-92043-6_29
