DOI: 10.1145/2513383.2513440

Exploring the use of speech input by blind people on mobile devices

Published: 21 October 2013

ABSTRACT

Much recent work has explored the challenge of nonvisual text entry on mobile devices. While researchers have attempted to solve this problem with gestures, we explore a different modality: speech. We conducted a survey with 169 blind and sighted participants to investigate how often, what for, and why blind people used speech for input on their mobile devices. We found that blind people used speech more often and input longer messages than sighted people. We then conducted a study with 8 blind people to observe how they used speech input on an iPod compared with the on-screen keyboard with VoiceOver. We found that speech was nearly 5 times as fast as the keyboard. While participants were mostly satisfied with speech input, editing recognition errors was frustrating. Participants spent an average of 80.3% of their time editing. Finally, we propose challenges for future work, including more efficient eyes-free editing and better error detection methods for reviewing text.
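The speed and editing figures above follow standard text-entry measures. As a rough, hypothetical illustration (none of the numbers or names below come from the paper), the sketch shows how words per minute and the share of task time spent editing are commonly computed, assuming the usual text-entry convention of five characters per "word" with the first character excluded because timing typically starts when it is entered:

```python
# Minimal sketch (not from the paper) of computing text-entry rate and the
# share of task time spent editing from hypothetical task logs.

def words_per_minute(transcribed_text: str, seconds: float) -> float:
    """Entry rate in words per minute (5 characters = 1 word, first char excluded)."""
    if seconds <= 0:
        raise ValueError("duration must be positive")
    return ((len(transcribed_text) - 1) / 5.0) * (60.0 / seconds)

def editing_share(editing_seconds: float, total_seconds: float) -> float:
    """Fraction of total task time spent correcting errors."""
    return editing_seconds / total_seconds

# Hypothetical numbers chosen only to mirror the reported ratios:
phrase = "the quick brown fox jumps over the lazy dog"   # 43 characters
speech_wpm = words_per_minute(phrase, seconds=12.0)      # ~42 WPM
keyboard_wpm = words_per_minute(phrase, seconds=60.0)    # ~8.4 WPM
print(f"speech: {speech_wpm:.1f} WPM, keyboard: {keyboard_wpm:.1f} WPM")
print(f"time spent editing: {editing_share(80.3, 100.0):.1%}")  # 80.3%
```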

Published in

ASSETS '13: Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility
October 2013, 343 pages
ISBN: 9781450324052
DOI: 10.1145/2513383
Copyright © 2013 ACM

Publisher

Association for Computing Machinery, New York, NY, United States

Publication History

Published: 21 October 2013


Qualifiers

research-article

Acceptance Rates

ASSETS '13 paper acceptance rate: 28 of 98 submissions (29%). Overall acceptance rate: 436 of 1,556 submissions (28%).
