ABSTRACT
Capturing and analyzing the detailed eye movements of a user reading a web page can reveal much about how web reading occurs. The WebGazeAnalyzer system described here uses a remote camera and requires no invasive head-mounted apparatus, giving test subjects a normal web-use experience while they perform web-based tasks. While many such systems have been used in the past to collect eye gaze data, WebGazeAnalyzer brings together several techniques for efficiently collecting, analyzing, and re-analyzing that data. We briefly describe techniques for overcoming the inherent inaccuracies of such apparatus and illustrate how we capture and analyze eye gaze data for commercial web design problems. The techniques developed here include methods for grouping fixations along lines of text and a reading analysis that measures reading speed, regressions, and coverage of web page text.
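The line-grouping and regression-counting techniques named above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it assumes fixations arrive as `(x, y, duration_ms)` tuples in screen coordinates, and the `LINE_TOLERANCE` threshold is a hypothetical parameter chosen here for the example, not a value from the paper.

```python
# Hypothetical sketch: group consecutive fixations into text lines by
# vertical proximity, then count regressions (leftward jumps within a line).
# Fixations are (x, y, duration_ms) tuples; LINE_TOLERANCE is an assumed
# parameter, not taken from the WebGazeAnalyzer paper.

LINE_TOLERANCE = 15  # max vertical distance (px) to remain on the same line

def group_into_lines(fixations):
    """Assign consecutive fixations to lines by y-coordinate proximity."""
    lines = []
    for fix in fixations:
        if lines and abs(fix[1] - lines[-1][-1][1]) <= LINE_TOLERANCE:
            lines[-1].append(fix)   # close enough vertically: same line
        else:
            lines.append([fix])     # vertical jump: start a new line
    return lines

def count_regressions(lines):
    """Count leftward (backward) eye movements within each grouped line."""
    return sum(
        1
        for line in lines
        for a, b in zip(line, line[1:])
        if b[0] < a[0]              # next fixation lies left of the previous
    )
```

A real system would also have to correct the systematic vertical drift of remote eye trackers before grouping; simple thresholding like this is only a starting point.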