
Virtual Navigation for Blind People: Building Sequential Representations of the Real-World

Published: 19 October 2017

ABSTRACT

When preparing to visit new locations, sighted people often look at maps to build an a priori mental representation of the environment as a sequence of step-by-step actions and points of interest (POIs), e.g., turn right after the coffee shop. Based on this observation, we would like to understand if building the same type of sequential representation, prior to navigating in a new location, is helpful for people with visual impairments (VI). In particular, our goal is to understand how the simultaneous interplay between turn-by-turn navigation instructions and the relevant POIs in the route can aid the creation of a memorable sequential representation of the world. To this end, we present two smartphone-based virtual navigation interfaces: VirtualLeap, which allows the user to jump through a sequence of street intersection labels, turn-by-turn instructions and POIs along the route; and VirtualWalk, which simulates variable speed step-by-step walking using audio effects, whilst conveying similar route information. In a user study with 14 VI participants, most were able to create and maintain an accurate mental representation of both the sequential structure of the route and the approximate locations of the POIs. While both virtual navigation modalities resulted in similar spatial understanding, results suggest that each method is useful in different interaction contexts.
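To make the idea of a sequential route representation concrete, the sketch below models a route as an ordered list of elements (intersections, turn-by-turn instructions, and POIs) with a cursor that can either jump element by element, as VirtualLeap does, or advance at a simulated walking pace, as VirtualWalk does. This is a minimal illustration written for this summary, not the authors' implementation; all names (RouteElement, VirtualNavigator, the announce hook) are hypothetical placeholders.

```python
# Hypothetical sketch of a sequential route representation and the two
# virtual navigation modes described in the abstract (not the authors' code).
from dataclasses import dataclass
from typing import List
import time

@dataclass
class RouteElement:
    kind: str          # "intersection", "instruction", or "poi"
    text: str          # e.g. "Turn right after the coffee shop"
    distance_m: float  # distance from the previous element along the route

class VirtualNavigator:
    def __init__(self, route: List[RouteElement], announce=print):
        self.route = route
        self.position = 0
        self.announce = announce  # a real app would use text-to-speech here

    def leap_forward(self):
        """VirtualLeap-style: jump to the next element and announce it."""
        if self.position < len(self.route) - 1:
            self.position += 1
            self.announce(self.route[self.position].text)

    def leap_backward(self):
        """Jump back to the previous element and announce it."""
        if self.position > 0:
            self.position -= 1
            self.announce(self.route[self.position].text)

    def walk_forward(self, speed_m_per_s: float = 1.4):
        """VirtualWalk-style: simulate walking to the next element.

        A real interface would play footstep audio during the delay; here we
        simply wait for the simulated travel time before announcing.
        """
        if self.position < len(self.route) - 1:
            nxt = self.route[self.position + 1]
            time.sleep(nxt.distance_m / speed_m_per_s)
            self.position += 1
            self.announce(nxt.text)

# Example route: intersections, instructions, and POIs in sequence.
route = [
    RouteElement("intersection", "Start at Main St and 1st Ave", 0),
    RouteElement("poi", "Coffee shop on your right", 30),
    RouteElement("instruction", "Turn right after the coffee shop", 5),
    RouteElement("intersection", "Cross 2nd Ave", 60),
    RouteElement("poi", "Destination: library entrance on your left", 20),
]

nav = VirtualNavigator(route)
nav.leap_forward()  # jumps straight to the coffee shop POI
```

Under this framing, both modes traverse the same underlying sequence and differ only in pacing, which is consistent with the abstract's observation that the two modalities produced similar spatial understanding while suiting different interaction contexts.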

Published in

        ASSETS '17: Proceedings of the 19th International ACM SIGACCESS Conference on Computers and Accessibility
        October 2017
        450 pages
ISBN: 9781450349260
DOI: 10.1145/3132525

        Copyright © 2017 ACM

        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

        Publisher

        Association for Computing Machinery

        New York, NY, United States

        Qualifiers

        • research-article

        Acceptance Rates

ASSETS '17 Paper Acceptance Rate: 28 of 126 submissions, 22%. Overall Acceptance Rate: 436 of 1,556 submissions, 28%.
