
Opportunistic controls: leveraging natural affordances as tangible user interfaces for augmented reality

Published: 27 October 2008

ABSTRACT

We present Opportunistic Controls, a class of user interaction techniques for augmented reality (AR) applications that support gesturing on, and receiving feedback from, otherwise unused affordances already present in the domain environment. Opportunistic Controls leverage characteristics of these affordances to provide passive haptics that ease gesture input, simplify gesture recognition, and provide tangible feedback to the user. 3D widgets are tightly coupled with affordances to provide visual feedback and hints about the functionality of the control. For example, a set of buttons is mapped to existing tactile features on domain objects. We describe examples of Opportunistic Controls that we have designed and implemented using optical marker tracking, combined with appearance-based gesture recognition. We present the results of a user study in which participants performed a simulated maintenance inspection of an aircraft engine using a set of virtual buttons implemented both as Opportunistic Controls and using simpler passive haptics. Opportunistic Controls allowed participants to complete their tasks significantly faster and were preferred over the baseline technique.
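The interaction loop the abstract describes — virtual buttons registered to tactile features on a tracked domain object, with an appearance-based check for the user's hand — can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the normalized-RGB skin test, the thresholds, the dwell logic, and the `OpportunisticButton` class are all assumptions; a real system would obtain the button regions from fiducial-marker tracking each frame and use a trained skin classifier.

```python
import numpy as np

def skin_mask(rgb):
    """Crude skin classifier in normalized-RGB space.
    (Illustrative thresholds only; real systems use trained color models.)"""
    rgb = rgb.astype(np.float32)
    total = rgb.sum(axis=-1) + 1e-6
    r = rgb[..., 0] / total
    g = rgb[..., 1] / total
    return (r > 0.38) & (r < 0.55) & (g > 0.25) & (g < 0.37)

class OpportunisticButton:
    """A virtual button mapped onto a tactile feature of a domain object.
    `region` is a bounding box in image coordinates, assumed to be
    re-derived every frame from optical marker tracking."""
    def __init__(self, region, dwell_frames=3, fill_thresh=0.3):
        self.region = region              # (x0, y0, x1, y1)
        self.dwell_frames = dwell_frames  # frames of contact before a press fires
        self.fill_thresh = fill_thresh    # fraction of region that must look like skin
        self._count = 0

    def update(self, mask):
        """Feed one frame's skin mask; returns True when the button is pressed."""
        x0, y0, x1, y1 = self.region
        patch = mask[y0:y1, x0:x1]
        filled = patch.mean() if patch.size else 0.0
        self._count = self._count + 1 if filled >= self.fill_thresh else 0
        return self._count >= self.dwell_frames
```

Requiring a short dwell before a press fires is one plausible way to exploit the passive haptics: the user rests a fingertip on the physical feature, so brief occlusions as the hand moves past other buttons do not trigger them.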


Supplemental Material

file137-2.avi (AVI, 19 MB)


Published in

              VRST '08: Proceedings of the 2008 ACM symposium on Virtual reality software and technology
              October 2008
              288 pages
ISBN: 9781595939517
DOI: 10.1145/1450579

              Copyright © 2008 ACM

              Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

              Publisher

              Association for Computing Machinery

              New York, NY, United States


              Qualifiers

              • research-article

              Acceptance Rates

VRST '08 Paper Acceptance Rate: 12 of 68 submissions, 18%. Overall Acceptance Rate: 66 of 254 submissions, 26%.
