ABSTRACT
In remote collaboration, gaze visualizations display where collaborators are looking in a shared visual space. This type of gaze-based intervention can improve coordination; however, researchers have yet to fully explore different gaze visualization techniques or to develop a deeper understanding of how features of visualizations interact with task attributes to influence collaborative performance. There are many ways to visualize characteristics of eye movements, such as a path connecting fixation points or a heat map illustrating fixation duration and coverage. In this study, we designed and evaluated three distinct gaze visualizations in a remote search task. Our results suggest that the design of gaze visualizations affects performance, coordination, searching behavior, and perceived utility, and that the degree of task coupling further moderates the effect of gaze visualizations on performance and coordination. We then reflect on the value of gaze visualizations for remote work and discuss implications for the design of gaze-based interventions.
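The two visualization primitives mentioned above can be sketched in code. The following is a minimal illustrative example, not the paper's implementation: the `Fixation` record, grid size, and function names are assumptions chosen for clarity. It converts a sequence of fixations into (a) an ordered scanpath and (b) a duration-weighted grid that a renderer could shade as a heat map.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Fixation:
    x: float        # screen position in pixels (hypothetical coordinate frame)
    y: float
    duration: float # fixation duration in seconds

def scanpath(fixations: List[Fixation]) -> List[Tuple[float, float]]:
    """Ordered fixation points; drawing a line through them gives a gaze path."""
    return [(f.x, f.y) for f in fixations]

def heatmap(fixations: List[Fixation], width: int, height: int,
            cell: int = 100) -> List[List[float]]:
    """Accumulate fixation duration into a coarse grid, capturing both
    coverage (which cells were visited) and dwell time (cell intensity)."""
    cols, rows = width // cell, height // cell
    grid = [[0.0] * cols for _ in range(rows)]
    for f in fixations:
        c = min(int(f.x) // cell, cols - 1)
        r = min(int(f.y) // cell, rows - 1)
        grid[r][c] += f.duration
    return grid
```

A renderer would then draw `scanpath` as a polyline and map each `heatmap` cell's accumulated duration to color opacity; the choice between the two determines whether viewers see the temporal order of gaze or its spatial concentration.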
An Eye For Design: Gaze Visualizations for Remote Collaborative Work