ABSTRACT
We describe a mouseless, near-surface version of the Bimanual Marking Menu system. To activate the menu, users pinch with either their index or middle finger to initiate a left or right click, then mark in the 3D space near the interactive surface. We demonstrate how the system can be implemented using a commodity range camera such as the Microsoft Kinect, and report on several designs of the 3D marking system.
Like the multi-touch marking menu, our system offers a large number of readily accessible commands. Because it does not rely on contact points to operate, our system leaves the non-dominant hand available for other multi-touch interactions.
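The abstract does not detail how pinches are recognized from the range camera, but a common computer-vision heuristic (used by Wilson, UIST'06) is that a closed pinch forms a loop between thumb and finger, which appears as a hole in the segmented hand silhouette. The sketch below, a hypothetical illustration rather than the authors' implementation, counts such holes in a binary hand mask using flood fill; a count of one or more would signal a pinch:

```python
from collections import deque

def count_holes(mask):
    """Count enclosed background regions ('holes') in a binary mask.

    A closed pinch forms a loop between thumb and finger, which shows
    up as a hole in the segmented hand silhouette. mask is a list of
    rows of 0/1 ints, where 1 marks hand pixels.
    """
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]

    def flood(sy, sx):
        # BFS over 4-connected background pixels starting at (sy, sx).
        q = deque([(sy, sx)])
        seen[sy][sx] = True
        while q:
            y, x = q.popleft()
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w \
                        and not seen[ny][nx] and mask[ny][nx] == 0:
                    seen[ny][nx] = True
                    q.append((ny, nx))

    # Background connected to the image border is not a hole.
    for y in range(h):
        for x in range(w):
            if (y in (0, h - 1) or x in (0, w - 1)) \
                    and mask[y][x] == 0 and not seen[y][x]:
                flood(y, x)

    # Any background pixel still unvisited lies inside the silhouette.
    holes = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] == 0 and not seen[y][x]:
                holes += 1
                flood(y, x)
    return holes
```

In a real pipeline the mask would come from thresholding the depth image at the near-surface interaction volume; which finger is pinching (index vs. middle, for left vs. right click) would require additional per-finger tracking not shown here.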