DOI: 10.1145/2513383.2513455

Follow that sound: using sonification and corrective verbal feedback to teach touchscreen gestures

Published: 21 October 2013

ABSTRACT

While sighted users may learn to perform touchscreen gestures through observation (e.g., of other users or video tutorials), such mechanisms are inaccessible for users with visual impairments. As a result, learning to perform gestures can be challenging. We propose and evaluate two techniques to teach touchscreen gestures to users with visual impairments: (1) corrective verbal feedback using text-to-speech and automatic analysis of the user's drawn gesture; (2) gesture sonification to generate sound based on finger touches, creating an audio representation of a gesture. To refine and evaluate the techniques, we conducted two controlled lab studies. The first study, with 12 sighted participants, compared parameters for sonifying gestures in an eyes-free scenario and identified pitch + stereo panning as the best combination. In the second study, 6 blind and low-vision participants completed gesture replication tasks with the two feedback techniques. Subjective data and preliminary performance findings indicate that the techniques offer complementary advantages.
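
To make the sonification idea concrete, the sketch below illustrates one plausible reading of the mapping the abstract describes: each touch sample contributes a short tone whose pitch follows the finger's vertical position and whose stereo panning follows its horizontal position (the pitch + stereo panning combination identified in the first study). This is not the authors' implementation; the frequency range, segment duration, constant-power panning law, and NumPy-based synthesis are all assumptions made for illustration.

```python
# Hypothetical sketch of gesture sonification (not the paper's implementation):
# y-position -> pitch, x-position -> stereo pan, rendered as a stereo buffer.
import numpy as np

SAMPLE_RATE = 44100              # audio samples per second
SEGMENT_SEC = 0.03               # audio rendered per touch sample (assumed)
MIN_HZ, MAX_HZ = 200.0, 1000.0   # assumed pitch range, not taken from the paper


def sonify_gesture(points, screen_w, screen_h):
    """Render a gesture (list of (x, y) touch samples) as a stereo waveform.
    Higher on the screen -> higher pitch; left/right position -> left/right pan."""
    left, right = [], []
    t = np.arange(int(SAMPLE_RATE * SEGMENT_SEC)) / SAMPLE_RATE
    phase = 0.0
    for x, y in points:
        # Map y (0 at top of screen) to frequency: top of screen = highest pitch.
        freq = MIN_HZ + (1.0 - y / screen_h) * (MAX_HZ - MIN_HZ)
        tone = np.sin(2 * np.pi * freq * t + phase)
        phase += 2 * np.pi * freq * SEGMENT_SEC  # keep the waveform phase-continuous
        # Constant-power pan from x: 0 = hard left, screen_w = hard right.
        pan = x / screen_w
        left.append(tone * np.cos(pan * np.pi / 2))
        right.append(tone * np.sin(pan * np.pi / 2))
    return np.stack([np.concatenate(left), np.concatenate(right)], axis=1)


if __name__ == "__main__":
    # A diagonal swipe from top-left toward bottom-right:
    # the pitch falls while the sound moves from the left channel to the right.
    pts = [(i * 32, i * 48) for i in range(20)]
    audio = sonify_gesture(pts, screen_w=640, screen_h=960)
    print(audio.shape)  # (samples, 2) stereo buffer, ready to play back or save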


Published in
          ASSETS '13: Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility
          October 2013
          343 pages
ISBN: 9781450324052
DOI: 10.1145/2513383

          Copyright © 2013 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

          Publisher

          Association for Computing Machinery

          New York, NY, United States

          Publication History

          • Published: 21 October 2013

          Qualifiers

          • research-article

          Acceptance Rates

ASSETS '13 paper acceptance rate: 28 of 98 submissions (29%). Overall acceptance rate: 436 of 1,556 submissions (28%).
