Research Article
DOI: 10.1145/3173574.3173923

An Eye For Design: Gaze Visualizations for Remote Collaborative Work

Published: 21 April 2018

ABSTRACT

In remote collaboration, gaze visualizations are designed to display where collaborators are looking in a shared visual space. This type of gaze-based intervention can improve coordination; however, researchers have yet to fully explore different gaze visualization techniques or develop a deeper understanding of how features of visualizations interact with task attributes to influence collaborative performance. There are many ways to visualize characteristics of eye movements, such as a path connecting fixation points or a heat map illustrating fixation duration and coverage. In this study, we designed and evaluated three distinct gaze visualizations in a remote search task. Our results suggest that the design of gaze visualizations affects performance, coordination, searching behavior, and perceived utility. Additionally, the degree of task coupling further influences the effect of gaze visualizations on performance and coordination. We then reflect on the value of gaze visualizations for remote work and discuss implications for the design of gaze-based interventions.

Supplemental Material

pn3145.mp4 (mp4, 176.7 MB)


Published in

CHI '18: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems
April 2018, 8489 pages
ISBN: 9781450356206
DOI: 10.1145/3173574

      Copyright © 2018 ACM


Publisher: Association for Computing Machinery, New York, NY, United States


Acceptance Rates

CHI '18 Paper Acceptance Rate: 666 of 2,590 submissions (26%). Overall Acceptance Rate: 6,199 of 26,314 submissions (24%).
