Research article
DOI: 10.1145/1878803.1878837

Leveraging proprioception to make mobile phones more accessible to users with visual impairments

Published: 25 October 2010

ABSTRACT

Accessing the advanced functions of a mobile phone is not a trivial task for users with visual impairments. They rely on screen readers and voice commands to discover and execute functions. In mobile situations, however, screen readers are not ideal because users may depend on their hearing for safety, and voice commands are difficult for a system to recognize in noisy environments. In this paper, we extend Virtual Shelves--an interaction technique that leverages proprioception to access application shortcuts--for visually impaired users. We measured the directional accuracy of visually impaired participants and found that they were less accurate than people with vision. We then built a functional prototype that uses an accelerometer and a gyroscope to sense its position and orientation. Finally, we evaluated the interaction and prototype by allowing participants to customize the placement of seven shortcuts within 15 regions. Participants were able to access shortcuts in their personal layout with 88.3% accuracy in an average of 1.74 seconds.
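The paper does not spell out its sensor-fusion or region-mapping details in the abstract, but the idea can be illustrated with a minimal sketch: a complementary filter fuses the gyroscope's angular rate (smooth but drifting) with the accelerometer's gravity-derived angle (noisy but drift-free), and the resulting pointing direction is quantized into one of 15 regions. The filter constant `alpha`, the angular ranges, and the 5x3 grid layout are all illustrative assumptions, not the authors' actual design.

```python
def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse a gyroscope rate with an accelerometer angle estimate.

    Integrates the gyro rate over dt for short-term accuracy, then blends in
    the accelerometer angle to correct long-term drift. alpha=0.98 is an
    illustrative constant, not a value from the paper.
    """
    return alpha * (angle_prev + gyro_rate * dt) + (1 - alpha) * accel_angle


def region_for(azimuth_deg, elevation_deg,
               az_range=(-60, 60), el_range=(0, 90),
               cols=5, rows=3):
    """Map a pointing direction to one of cols*rows regions (0..14).

    Assumes (hypothetically) that the 15 regions form a 5x3 grid of
    azimuth x elevation in front of the user. Returns None if the
    direction falls outside the grid.
    """
    az_lo, az_hi = az_range
    el_lo, el_hi = el_range
    if not (az_lo <= azimuth_deg < az_hi and el_lo <= elevation_deg < el_hi):
        return None
    col = int((azimuth_deg - az_lo) / (az_hi - az_lo) * cols)
    row = int((elevation_deg - el_lo) / (el_hi - el_lo) * rows)
    return row * cols + col
```

For example, pointing straight ahead at a 45-degree elevation (`region_for(0, 45)`) lands in the centre cell of the middle row under these assumed ranges. A real implementation would also need gyroscope bias calibration, which the cited Kalman-filter literature (refs. 9, 22, 26) addresses.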

References

  1. Adamo, D.E., Alexander, N.B., and Brown, S.H. 2009. The influence of age and physical activity on upper limb proprioceptive ability. Journal of Aging and Physical Activity 17, 3 (July 2009), 272--293.
  2. Amar, R., Dow, S., Gordon, R., Hamid, M.R., and Sellers, C. 2003. Mobile ADVICE: an accessible device for visually impaired capability enhancement. In CHI '03 Extended Abstracts on Human Factors in Computing Systems. ACM, New York, NY, 918--919.
  3. Biggs, J. and Srinivasan, M.A. 2002. Haptic interfaces. In Stanney, K. (ed.), Handbook of Virtual Environments: Design, Implementation, and Applications. Lawrence Erlbaum Associates.
  4. Blasko, G., Narayanaswami, C., and Feiner, S. 2006. Prototyping retractable string-based interaction techniques for dual-display mobile devices. In Proc. of CHI '06. ACM, New York, NY, 369--372.
  5. Brewster, S., Lumsden, J., Bell, M., Hall, M., and Tasker, S. 2003. Multimodal 'eyes-free' interaction techniques for wearable devices. In Proc. of CHI '03. ACM, New York, NY, 473--480.
  6. Cao, X., Massimi, M., and Balakrishnan, R. 2008. Flashlight jigsaw: an exploratory study of an ad-hoc multi-player game on public displays. In Proc. of CSCW '08. ACM, New York, NY, 77--86.
  7. Clark, F.J. 1992. How accurately can we perceive positions of our limbs? Behavioral and Brain Sciences 15, 4 (December 1992), 725--726.
  8. Feiner, S. and Shamash, A. 1991. Hybrid user interfaces: Breeding virtually bigger interfaces for physically smaller computers. In Proc. of UIST '91. ACM, New York, NY, 9--17.
  9. Foxlin, E. 1996. Inertial head-tracker sensor fusion by a complementary separate-bias Kalman filter. In Proc. of VRAIS '96. IEEE, Washington, DC, 185.
  10. Hansen, T.R., Eriksson, E., and Lykke-Olesen, A. 2006. Use your head: exploring face tracking for mobile interaction. In CHI '06 Extended Abstracts on Human Factors in Computing Systems. ACM, New York, NY, 845--850.
  11. Harrison, C. and Hudson, S.E. 2009. Abracadabra: wireless, high-precision, and unpowered finger input for very small mobile devices. In Proc. of UIST '09. ACM, New York, NY, 121--124.
  12. Hinckley, K., Pierce, J., Sinclair, M., and Horvitz, E. 2000. Sensing techniques for mobile interaction. In Proc. of UIST '00. ACM, New York, NY, 91--100.
  13. Hsieh, T., Wang, Q.Y., and Paepcke, A. 2009. Piles across space: Breaking the real-estate barrier on small-display devices. International Journal of Human-Computer Studies 67, 4 (April 2009), 349--365.
  14. Kane, S.K., Bigham, J.P., and Wobbrock, J.O. 2008. Slide Rule: making mobile touch screens accessible to blind people using multi-touch interaction techniques. In Proc. of ASSETS '08. ACM, New York, NY, 73--80.
  15. Li, F.C.Y., Dearman, D., and Truong, K.N. 2009. Virtual Shelves: interactions with orientation aware devices. In Proc. of UIST '09. ACM, New York, NY, 125--128.
  16. Li, K.A., Baudisch, P., and Hinckley, K. 2008. Blindsight: eyes-free access to mobile phones. In Proc. of CHI '08. ACM, New York, NY, 1389--1398.
  17. Miller, G.A. 1956. The magical number seven, plus or minus two: Some limits on our capacity for processing information. Psychological Review 63, 2, 81--97.
  18. O'Neill, E., Kaenampornpan, M., Kostakos, V., Warr, A., and Woodgate, D. 2006. Can we do without GUIs? Gesture and speech interaction with a patient information system. Personal and Ubiquitous Computing 10, 5 (July 2006), 269--283.
  19. Oakley, I. and Park, J. 2007. A motion-based marking menu system. In CHI '07 Extended Abstracts on Human Factors in Computing Systems. ACM, New York, NY, 2597--2602.
  20. Pirhonen, A., Brewster, S., and Holguin, C. 2002. Gestural and audio metaphors as a means of control for mobile devices. In Proc. of CHI '02. ACM, New York, NY, 291--298.
  21. Rekimoto, J. 1996. Tilting operations for small screen interfaces. In Proc. of UIST '96. ACM, New York, NY, 167--168.
  22. Roumeliotis, S.I. and Bekey, G.A. 1997. An extended Kalman filter for frequent local and infrequent global sensor data fusion. In Proc. of Sensor Fusion and Decentralized Control in Autonomous Robotic Systems. SPIE, Bellingham, WA, 11--22.
  23. Sawhney, N. and Schmandt, C. 2000. Nomadic Radio: speech and audio interaction for contextual messaging in nomadic environments. ACM Transactions on Computer-Human Interaction 7, 3 (September 2000), 353--383.
  24. Stifelman, L.J., Arons, B., Schmandt, C., and Hulteen, E.A. 1993. VoiceNotes: a speech interface for a hand-held voice notetaker. In Proc. of INTERACT '93 and CHI '93. ACM, New York, NY, 179--186.
  25. Suzuki, Y., Nakadai, Y., Shimamura, Y., and Nishino, Y. 1998. Development of an integrated wristwatch-type PHS telephone. NTT Review 10, 6, 93--101.
  26. Vaganay, J., Aldon, M.J., and Fournier, A. 1993. Mobile robot attitude estimation by fusion of inertial data. In Proc. of the 1993 IEEE International Conference on Robotics and Automation. IEEE, Washington, DC, 277--282.
  27. Wilson, A. and Shafer, S. 2003. XWand: UI for intelligent spaces. In Proc. of CHI '03. ACM, New York, NY, 545--552.
  28. Yee, K.P. 2003. Peephole Displays: pen interaction on spatially aware handheld computers. In Proc. of CHI '03. ACM, New York, NY, 1--8.
  29. Yfantidis, G. and Evreinov, G. 2006. Adaptive blind interaction technique for touchscreens. Universal Access in the Information Society 4, 4 (May 2006), 328--337.
  30. Zhao, S., Dragicevic, P., Chignell, M., Balakrishnan, R., and Baudisch, P. 2007. earPod: eyes-free menu selection using touch input and reactive audio feedback. In Proc. of CHI '07. ACM, New York, NY, 1395--1404.

Published in

ASSETS '10: Proceedings of the 12th International ACM SIGACCESS Conference on Computers and Accessibility
October 2010, 346 pages
ISBN: 978-1-60558-881-0
DOI: 10.1145/1878803

Copyright © 2010 ACM


Publisher: Association for Computing Machinery, New York, NY, United States


Overall acceptance rate: 436 of 1,556 submissions (28%)
