Bootstrapper: Recognizing Tabletop Users by Their Shoes

ABSTRACT
To enable personalized functionality, such as logging tabletop activity by user, tabletop systems need to recognize users. DiamondTouch does so reliably, but requires users to stay in assigned seats and cannot recognize users across sessions. We propose a different approach based on distinguishing users' shoes. While users interact with the table, our system, Bootstrapper, observes their shoes using one or more depth cameras mounted at the edge of the table. It then identifies users by matching camera images against a database of known shoe images. When multiple users interact, Bootstrapper associates touches with shoes based on hand orientation. The approach can be implemented using consumer depth cameras because (1) shoes offer large, distinct features such as color, and (2) shoes naturally align with the ground, giving the system a well-defined perspective and thus reduced ambiguity. We report two simple studies in which Bootstrapper recognized participants from a database of 18 users with 95.8% accuracy.
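The identification step described above, matching a camera image against a database of known shoe images, can be sketched as descriptor matching with a nearest-neighbor ratio test. The sketch below is a minimal, hypothetical illustration in plain NumPy, not the paper's implementation: it assumes each user's shoes are already represented by a set of local feature descriptors (e.g., extracted with SIFT or SURF, which the paper's references suggest), and identifies a query by counting ratio-test matches per enrolled user. The function names `match_count` and `identify_user` are invented for this example.

```python
import numpy as np

def match_count(query_desc, user_desc, ratio=0.75):
    """Count query descriptors whose nearest neighbor among user_desc
    passes Lowe's ratio test (nearest vs. second-nearest distance).
    Both arguments are (n, d) arrays of feature descriptors."""
    # Pairwise Euclidean distances, shape (n_query, n_user).
    d = np.linalg.norm(query_desc[:, None, :] - user_desc[None, :, :], axis=2)
    d.sort(axis=1)  # per query descriptor: distances in ascending order
    if d.shape[1] < 2:
        # Too few stored descriptors for a ratio test; count exact hits only.
        return int(np.sum(d[:, 0] < 1e-6))
    return int(np.sum(d[:, 0] < ratio * d[:, 1]))

def identify_user(query_desc, database):
    """Return the enrolled user whose stored shoe descriptors
    collect the most ratio-test matches against the query."""
    scores = {user: match_count(query_desc, desc)
              for user, desc in database.items()}
    return max(scores, key=scores.get)
```

In a real pipeline the descriptors would come from a feature extractor run on the depth camera's color image, and the vote could additionally be weighted by match quality; the sketch only shows the database-lookup structure.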
REFERENCES
- Abate, A.F., Nappi, M., Riccio, D., Sabatino, G. 2D and 3D face recognition: A survey. Pattern Recognition Letters 28(14), 1885--1906.
- Annett, M., Grossman, T., Wigdor, D., Fitzmaurice, G. Medusa: A proximity-aware multi-touch tabletop. Proc. UIST '11.
- Augsten, T., Kaefer, K., Fetzer, C., Meusel, R., Kanitz, D., Stoff, T., Becker, T., Holz, C., Baudisch, P. Multitoe: High-precision interaction with back-projected floors based on high-resolution multi-touch input. Proc. UIST '10, 209--218.
- Bay, H., Ess, A., Tuytelaars, T., Van Gool, L. Speeded-Up Robust Features. Computer Vision and Image Understanding (2007), 346--359.
- Dietz, P. and Leigh, D. DiamondTouch: A multi-user touch technology. Proc. UIST '01, 219--226.
- Holz, C. and Baudisch, P. The generalized perceived input point model and how to double touch accuracy by extracting fingerprints. Proc. CHI '10, 581--590.
- Holz, C. and Baudisch, P. Understanding touch. Proc. CHI '11, 2501--2510.
- Lepinski, J.G., Grossman, T., Fitzmaurice, G. The design and evaluation of multitouch marking menus. Proc. CHI '10, 2233--2242.
- Lowe, D. Distinctive image features from scale-invariant keypoints. International Journal of Computer Vision (2004).
- Matsushita, N. and Rekimoto, J. HoloWall: Designing a finger, hand, body, and object sensitive wall. Proc. UIST '97, 209--210.
- Meyer, T. and Schmidt, D. IdWristbands: IR-based user identification on multi-touch surfaces. Poster at ITS '10.
- Olwal, A. and Wilson, A. SurfaceFusion: Unobtrusive tracking of everyday objects in tangible user interfaces. Proc. GI '08, 235--242.
- OpenCV. http://opencv.willowgarage.com
- Orr, R.J. and Abowd, G.D. The Smart Floor: A mechanism for natural user identification and tracking. CHI '00 Extended Abstracts, 275--276.
- Piper, A., O'Brien, E., Ringel Morris, M., and Winograd, T. SIDES: A cooperative tabletop computer game for social skills development. Proc. CSCW '06, 1--10.
- Roth, V., Schmidt, P., and Güldenring, B. The IR Ring: Authenticating users' touches on a multi-touch display. Proc. UIST '10, 259--262.
- Schmidt, D., Chong, M., and Gellersen, H. HandsDown: Hand-contour-based user identification for interactive surfaces. Proc. NordiCHI '10, 432--441.
- Sugiura, A. and Koseki, Y. A user interface using fingerprint recognition: Holding commands and data objects on fingers. Proc. UIST '98, 71--79.