ABSTRACT
Touch is a compelling input modality for interactive devices; however, touch input on the small screen of a mobile device is problematic because the user's fingers occlude the very graphical elements they wish to work with. In this paper, we present LucidTouch, a mobile device that addresses this limitation by allowing the user to control the application by touching the back of the device. The key to making this usable is what we call pseudo-transparency: by overlaying an image of the user's hands onto the screen, we create the illusion that the mobile device itself is semi-transparent. This pseudo-transparency allows users to acquire targets accurately while not occluding the screen with their fingers and hands. LucidTouch also supports multi-touch input, allowing users to operate the device simultaneously with all ten fingers. We present initial study results indicating that many users found touching the back preferable to touching the front, due to reduced occlusion, higher precision, and the ability to provide multi-finger input.
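The pseudo-transparency described above reduces to two operations: mapping rear-surface touch coordinates onto the front display (the back is horizontally mirrored relative to the screen), and alpha-blending a captured image of the user's hands over the UI. The sketch below is illustrative only, not the authors' implementation; the hand image, its mask, and the blend factor are assumed inputs:

```python
import numpy as np

def mirror_back_touch(x, y, screen_width):
    """Map a touch on the device's back to front-screen coordinates.

    The rear surface is horizontally mirrored relative to the display,
    so only the x axis flips; y is unchanged.
    """
    return screen_width - 1 - x, y

def overlay_hands(ui, hands, mask, alpha=0.5):
    """Composite a captured hand image over a UI frame.

    ui, hands: float arrays of shape (H, W, 3) with values in [0, 1]
    mask:      float array (H, W) in [0, 1], 1 where a hand is visible
    alpha:     hand opacity; values below 1 keep the underlying UI
               legible, producing the pseudo-transparency effect
    """
    m = (mask * alpha)[..., None]          # broadcast over color channels
    return ui * (1.0 - m) + hands * m

# Example: a back touch at x=10 on a 320-px-wide screen lands at x=309.
print(mirror_back_touch(10, 50, 320))      # (309, 50)
```

With alpha around 0.5, targets under the fingers remain visible through the rendered hand silhouette, which is what makes back-of-device targeting precise despite the fingers being out of sight.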