ABSTRACT
Presentation aids, such as the laser pointer, are commonly used in lectures and public speeches, yet their effect on the audience has not been properly studied. We present an experiment comparing several pointer alternatives. One of them is GazeLaser, a new solution that requires no manually operated pointer but is instead driven by the lecturer's gaze. GazeLaser fares well in the comparison, but comes second to the pointing tool available in PowerPoint. The experiment raises issues that need to be taken into account in the further development of GazeLaser.
GazeLaser: A Hands-Free Highlighting Technique for Presentations