ABSTRACT
"Tilt scrolling" and "peephole display" are popular user interface ideas for small computers, and inertial sensors were often the choice for the realization of such ideas. Inertial sensors, however, have two fundamental limitations; the frame of reference is not the user but the earth, and drifting errors are difficult to overcome. A possibly better solution, free from such limitations, is machine vision. Machine vision was a luxury for small computers but is becoming a practical solution because a camera is now a common component of small computers and the required vision algorithm is already running in optical mice. The vision algorithm of our prototype, which we called ISeeU, tracks simple features in the user's face and calculates lateral displacements and changes in distance, which in turn are used to control scrolling and zooming. An informal test with scrolling tasks indicates that its performance is comparable with a user interface using arrow keys.