Enabling the blind to see gestures

Published: 11 April 2013

Abstract

Human discourse is an embodied activity, emerging from the imagery and construction of our talk. Gesture and speech are coexpressive, conveying this imagery and meaning simultaneously. Mathematics instruction and discourse typically involve two modes of communication: speech and graphical presentation. Our goal is to help Individuals who are Blind or Severely Visually Impaired (IBSVI) access such instruction and communication. We employ a haptic glove interface to give the IBSVI awareness of the deictic gestures performed by the instructor over the graphic in conjunction with speech. We present a series of studies spanning two years showing how our Haptic Deictic System (HDS) can support learning in inclusive classrooms where IBSVI receive instruction alongside sighted students. We discuss how the introduction of the HDS was advantageous to all parties: IBSVI, instructor, and sighted students. The HDS created more learning opportunities, increasing mutual understanding and promoting greater engagement.



• Published in

  ACM Transactions on Computer-Human Interaction, Volume 20, Issue 1
  Special issue on the theory and practice of embodied interaction in HCI and interaction design
  March 2013, 171 pages
  ISSN: 1073-0516
  EISSN: 1557-7325
  DOI: 10.1145/2442106

      Copyright © 2013 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

      Publisher

      Association for Computing Machinery

      New York, NY, United States

Publication History

• Received: 1 October 2011
• Revised: 1 September 2012
• Accepted: 1 November 2012
• Published: 11 April 2013


      Qualifiers

      • research-article
      • Research
      • Refereed
