Extended abstract · DOI: 10.1145/1520340.1520626

WUW - wear Ur world: a wearable gestural interface

Published: 04 April 2009

ABSTRACT

Information is traditionally confined to paper or, digitally, to a screen. In this paper, we introduce WUW, a wearable gestural interface, which attempts to bring information out into the tangible world. By using a tiny projector and a camera mounted on a hat or coupled in a pendant-like wearable device, WUW sees what the user sees and visually augments surfaces or physical objects the user is interacting with. WUW projects information onto surfaces, walls, and physical objects around us, and lets the user interact with the projected information through natural hand gestures, arm movements, or interaction with the object itself.
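
The abstract does not detail the sensing pipeline, so what follows is only a minimal sketch of how a projector-camera wearable of this kind could turn the camera feed into input for projected content. It assumes color-marker fingertip tracking with OpenCV and a pre-computed camera-to-projector homography; neither is specified in this extended abstract, and the HSV thresholds, the identity homography, and the helper names find_marker and camera_to_projector are all illustrative.

# Minimal sketch (assumed design, not the authors' implementation): detect a
# colored fingertip marker in the wearable camera's frame and map its position
# into projector coordinates so projected content can respond to it.
import cv2
import numpy as np

# Assumed HSV range for a single colored marker (e.g., a red cap on the index finger).
MARKER_LOW = np.array([0, 120, 120])
MARKER_HIGH = np.array([10, 255, 255])

# Placeholder camera-to-projector homography; a real system would obtain this
# from a calibration step rather than using the identity matrix.
CAM_TO_PROJ = np.eye(3, dtype=np.float32)

def find_marker(frame_bgr):
    """Return the (x, y) centroid of the largest marker-colored blob, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, MARKER_LOW, MARKER_HIGH)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

def camera_to_projector(point):
    """Warp a camera-space point into projector space with the assumed homography."""
    src = np.array([[point]], dtype=np.float32)   # shape (1, 1, 2)
    dst = cv2.perspectiveTransform(src, CAM_TO_PROJ)
    return tuple(dst[0, 0])

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)                     # stand-in for the wearable camera
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        marker = find_marker(frame)
        if marker is not None:
            x, y = camera_to_projector(marker)
            # A real system would drive the projected UI here, e.g. move a cursor
            # or fire a gesture event when the fingertip dwells or strokes.
            print(f"fingertip at projector coords ({x:.0f}, {y:.0f})")
        cv2.imshow("camera", frame)
        if cv2.waitKey(1) & 0xFF == 27:           # Esc to quit
            break
    cap.release()
    cv2.destroyAllWindows()

In a deployed system the homography would come from calibration (for example, projecting known points and locating them in the camera image), and single-marker detection would likely be replaced or augmented by more robust multi-finger or markerless hand tracking.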

Published in

CHI EA '09: CHI '09 Extended Abstracts on Human Factors in Computing Systems
April 2009, 2470 pages
ISBN: 9781605582474
DOI: 10.1145/1520340

Copyright © 2009 held by the owner/author(s). Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author.

Publisher: Association for Computing Machinery, New York, NY, United States

Published: 4 April 2009

Acceptance rates: CHI EA '09 accepted 385 of 1,130 submissions (34%); the overall acceptance rate for the series is 6,164 of 23,696 submissions (26%).
