ABSTRACT
Text input with eye trackers can be implemented in many ways, such as on-screen keyboards or context-sensitive menu-selection techniques. We propose the use of off-screen targets and various schemes for decoding target hit sequences into text. Off-screen targets help to avoid the Midas touch problem and conserve display area. However, the number and location of the off-screen targets are a major usability issue. We discuss the use of Morse code, our Minimal Device Independent Text Input Method (MDITIM), QuikWriting, and Cirrin-like target arrangements. Furthermore, we describe our experience with an experimental system that implements eye-tracker-controlled MDITIM for the Windows environment.
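To make the decoding idea concrete, the following is a minimal sketch (not the paper's implementation) of one scheme the abstract names: Morse code. It assumes two off-screen targets producing "dot" and "dash" hits, with a third event, written here as a space, marking the end of a letter; the function and event names are illustrative.

```python
# Illustrative sketch: decoding an off-screen target hit sequence via Morse code.
# '.' and '-' stand for hits on two off-screen targets; ' ' is a hypothetical
# end-of-letter event (e.g. the gaze returning to the display area).
MORSE_TO_CHAR = {
    ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
    "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
    "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
    ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
    "..-": "U", "...-": "V", ".--": "W", "-..-": "X", "-.--": "Y",
    "--..": "Z",
}

def decode_hits(hits: str) -> str:
    """Translate a hit sequence ('.', '-', ' ' as letter break) into text."""
    text, current = [], []
    for hit in hits:
        if hit == " ":      # end-of-letter event: look up the collected code
            if current:
                text.append(MORSE_TO_CHAR.get("".join(current), "?"))
                current = []
        else:               # accumulate dots and dashes for the current letter
            current.append(hit)
    if current:             # flush the final letter
        text.append(MORSE_TO_CHAR.get("".join(current), "?"))
    return "".join(text)

print(decode_hits("... --- ..."))  # prints "SOS"
```

With only two targets plus a break event the target layout stays trivial, which illustrates the trade-off the abstract raises: fewer off-screen targets simplify placement but require longer hit sequences per character.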
REFERENCES
- 1. P. Isokoski and R. Raisamo. Device Independent Text Input: A Rationale and an Example. Proceedings of AVI 2000 Conference on Advanced Visual Interfaces, pages 76-83, ACM, New York, 2000.
- 2. R. J. K. Jacob. Eye Movement-Based Human-Computer Interaction Techniques: Toward Non-Command Interfaces. In H. R. Hartson and D. Hix, editors, Advances in Human-Computer Interaction, pages 151-190, Ablex Publishing Co., Norwood, N.J., 1993.
- 3. J. Mankoff and G. D. Abowd. Cirrin: A Word-Level Unistroke Keyboard for Pen Input. Proceedings of UIST '98 ACM Symposium on User Interface Software and Technology, pages 213-214, ACM, New York, 1998.
- 4. Morse Code Input System for the Windows 2000 Operating System (Proposal draft), MORSE 2000 Outreach, http://www.uwec.edu/academic/hss-or/Morse2000/MorseSpecification.doc.
- 5. K. Perlin. Quikwriting: Continuous Stylus-Based Text Entry. Proceedings of UIST '98 ACM Symposium on User Interface Software and Technology, pages 215-216, ACM, New York, 1998.
- 6. QuikWriting 2.1 release notes, http://www.mrl.nyu.edu/perlin/demos/quikwriting21-rel-notes.html, 2000.
- 7. D. D. Salvucci and J. R. Anderson. Intelligent Gaze-Added Interfaces. Proceedings of CHI 2000, pages 273-280, ACM, New York, 2000.
- 8. SensoMotoric Instruments, Adaptive visual keyboard, http://www.smi.de/em/index.html
- 9. SensoMotoric Instruments, EyeLink Gaze Tracking, http://www.smi.de/el/index.html
- 10. L. E. Sibert and R. J. K. Jacob. Evaluation of Eye Gaze Interaction. Proceedings of CHI 2000, pages 281-288, ACM, New York, 2000.
- 11. R. W. Soukoreff and I. S. MacKenzie. Theoretical upper and lower bounds on typing speed using a stylus and soft keyboard. Behaviour & Information Technology, 14(6), 370-379, Taylor & Francis Ltd., London, 1995.
- 12. V. Tanriverdi and R. J. K. Jacob. Interacting with eye movements in virtual environments. Proceedings of CHI 2000, pages 265-272, ACM, New York, 2000.
- 13. S. Zhai, C. Morimoto, and S. Ihde. Manual And Gaze Input Cascaded (MAGIC) Pointing. Proceedings of CHI 99, pages 246-253, ACM, New York, 1999.
Index Terms
- Text input methods for eye trackers using off-screen targets