ABSTRACT
Sensing the depth (distance from the surface) of fingers and hands near a tabletop enables three-dimensional (3D) gesture interaction in multi-touch applications, much as we interact in the real world. We introduce Z-touch, a multi-touch table that senses the approximate postures of fingers and hands in close proximity to the tabletop surface. Z-touch uses a vision-based posture-sensing system: multilayered infrared (IR) laser planes are synchronized with the shutter signals of a high-speed camera, which captures an image of each laser layer. A depth map is then computed from the captured images. Our prototype runs at approximately 30 fps. Z-touch exploits not only finger/hand contact points but also the angles of hovering fingers. This angle-based interaction is distinctive in that it allows users to control multiple parameters with a single finger. In this paper, we describe the principle of our finger-detection method and its applications (e.g., drawing, a map-zooming viewer, and Bezier curve control).
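The layered sensing principle described above can be sketched in a few lines. This is a hypothetical illustration, not the authors' implementation: we assume each laser plane sits at a known height above the tabletop, that one camera frame is captured per plane, and that a pixel's depth is the height of the lowest lit plane. The `depth_map` and `finger_angle` names and the intensity threshold are our own assumptions for the sketch.

```python
import numpy as np

def depth_map(frames, plane_heights, threshold=0.5):
    """frames: (K, H, W) per-layer IR intensities in [0, 1];
    plane_heights: height of each laser plane above the surface.
    Returns an (H, W) depth map holding, per pixel, the height of the
    lowest lit plane, or +inf where nothing is detected."""
    frames = np.asarray(frames, dtype=float)
    lit = frames > threshold                       # (K, H, W) boolean
    heights = np.asarray(plane_heights, dtype=float)
    # Broadcast each plane's height over its image; unlit pixels get +inf
    h = np.where(lit, heights[:, None, None], np.inf)
    return h.min(axis=0)                           # plane nearest the surface

def finger_angle(frames, plane_heights, threshold=0.5):
    """Rough tilt estimate: take the centroid of the lit pixels in each
    layer and measure the lateral drift between the lowest and highest
    occupied layers. Returns the angle from vertical in radians
    (0 = finger perpendicular to the table), or None if fewer than two
    layers are pierced."""
    pts = []
    for img, h in zip(np.asarray(frames, dtype=float), plane_heights):
        ys, xs = np.nonzero(img > threshold)
        if len(xs):
            pts.append((xs.mean(), ys.mean(), h))
    if len(pts) < 2:
        return None
    (x0, y0, h0), (x1, y1, h1) = pts[0], pts[-1]
    lateral = np.hypot(x1 - x0, y1 - y0)
    return np.arctan2(lateral, h1 - h0)

# Toy example: three planes at 5, 10, 15 mm; a tilted finger pierces
# the two lowest planes, drifting half a pixel between them.
frames = np.zeros((3, 4, 4))
frames[0, 1, 1] = 1.0            # fingertip in the lowest plane
frames[1, 1, 1] = 1.0            # finger shaft one plane up
frames[1, 1, 2] = 1.0
d = depth_map(frames, [5.0, 10.0, 15.0])
a = finger_angle(frames, [5.0, 10.0, 15.0])
```

Because the angle estimate needs only the per-layer centroids, a single hovering finger yields both a position (from the depth map) and a tilt (from the centroid drift), which is what lets one finger drive multiple parameters.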