Published in: Universal Access in the Information Society 4/2020

04.11.2019 | Long Paper

Understanding visually impaired people’s experiences of social signal perception in face-to-face communication

Authors: Shi Qiu, Pengcheng An, Jun Hu, Ting Han, Matthias Rauterberg

Abstract

Social signals (e.g., facial expressions, gestures) are important in social interactions. Most of them are visual cues, which are hardly accessible to visually impaired people and thus cause difficulties in their daily living. In human–computer interaction (HCI), assistive systems for social interactions are receiving increasing attention due to related technological advancements. Yet, there is still a lack of a comprehensive and vivid understanding of visually impaired people's social signal perception that would broadly identify their needs in face-to-face communication. To fill this gap, we conducted in-depth interviews to study the lived experiences of 20 visually impaired participants. We analyzed a rich set of qualitative empirical data based on a comprehensive taxonomy of social signals, using a standard qualitative content analysis method. Our results reveal a set of vivid examples and an overview of visually impaired people's lived experiences regarding social signals, covering both their capabilities and limitations. As reported, the participants perceived social signals through compensatory modalities such as hearing, touch, smell, or obstacle sense. However, their perception of social signals is generally of low resolution and limited by certain environmental factors (e.g., crowdedness or the noise level of the surroundings). Interestingly, low-vision participants still relied substantially on their residual sight for social signal perception (e.g., of rough postures and gestures). Moreover, the participants experienced difficulties in sensing others' subtle emotional states, which are often revealed by nuanced behaviors (e.g., a smile). Based on these rich empirical findings, we propose a set of design implications to inform future HCI work aimed at supporting visually impaired users' social signal perception.


Metadata
Title
Understanding visually impaired people’s experiences of social signal perception in face-to-face communication
Authors
Shi Qiu
Pengcheng An
Jun Hu
Ting Han
Matthias Rauterberg
Publication date
04.11.2019
Publisher
Springer Berlin Heidelberg
Published in
Universal Access in the Information Society / Issue 4/2020
Print ISSN: 1615-5289
Electronic ISSN: 1615-5297
DOI
https://doi.org/10.1007/s10209-019-00698-3
