ABSTRACT
While sighted users may learn to perform touchscreen gestures through observation (e.g., of other users or video tutorials), such mechanisms are inaccessible to users with visual impairments. As a result, learning to perform gestures can be challenging. We propose and evaluate two techniques to teach touchscreen gestures to users with visual impairments: (1) corrective verbal feedback using text-to-speech and automatic analysis of the user's drawn gesture; and (2) gesture sonification, which generates sound based on finger touches to create an audio representation of a gesture. To refine and evaluate the techniques, we conducted two controlled lab studies. The first study, with 12 sighted participants, compared parameters for sonifying gestures in an eyes-free scenario and identified pitch + stereo panning as the best combination. In the second study, 6 blind and low-vision participants completed gesture replication tasks with the two feedback techniques. Subjective data and preliminary performance findings indicate that the techniques offer complementary advantages.
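The pitch + stereo panning combination can be illustrated with a minimal sketch of such a mapping: vertical touch position drives pitch and horizontal position drives panning. The frequency range and the exponential pitch scale here are illustrative assumptions, not the parameter values used in the studies.

```python
def sonify_point(x, y, width, height, f_min=220.0, f_max=880.0):
    """Map a touch point to (frequency_hz, stereo_pan).

    Illustrative pitch + stereo panning mapping: y controls pitch,
    x controls left/right panning. f_min/f_max are assumed values.
    """
    # Screen y grows downward, so invert: t = 0 at bottom, 1 at top.
    t = 1.0 - (y / height)
    # Exponential interpolation keeps equal pixel steps perceptually
    # even in pitch (equal ratios rather than equal hertz).
    frequency = f_min * (f_max / f_min) ** t
    # Horizontal position maps linearly to pan: -1 (left) .. +1 (right).
    pan = 2.0 * (x / width) - 1.0
    return frequency, pan
```

For example, on a 320 x 480 screen, a touch at the bottom-left corner yields the lowest pitch panned fully left, while a touch at the top-right corner yields the highest pitch panned fully right, so a listener can track a finger's trajectory by ear alone.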
Follow that sound: using sonification and corrective verbal feedback to teach touchscreen gestures