Published in: Cognitive Computation 3/2011

01.09.2011

Movements and Holds in Fluent Sentence Production of American Sign Language: The Action-Based Approach

Authors: Bernd J. Kröger, Peter Birkholz, Jim Kannampuzha, Emily Kaufmann, Irene Mittelberg


Abstract

The importance of bodily movements in the production and perception of communicative actions has been shown for the spoken language modality and accounted for by a theory of communicative actions (Cogn. Process. 2010;11:187–205). In this study, the theory of communicative actions was adapted to the sign language modality; we tested the hypothesis that in the fluent production of short sign language sentences, strong-hand manual sign actions are continuously ongoing without holds, while co-manual oral expression actions (i.e. sign-related actions of the lips, jaw, and tip of the tongue) and co-manual facial expression actions (i.e. actions of the eyebrows, eyelids, etc.), as well as weak-hand actions, show considerable holds. An American Sign Language (ASL) corpus of 100 sentences was analyzed by visually inspecting each frame-to-frame difference (30 frames/s) for separating movement and hold phases for each manual, oral, and facial action. Excluding fingerspelling and signs in sentence-final position, no manual holds were found for the strong hand (0%; the weak hand is not considered), while oral holds occurred in 22% of all oral expression actions and facial holds occurred for all facial expression actions analyzed (100%). These results support the idea that in each language modality, the dominant articulatory system (vocal tract or manual system) determines the timing of actions. In signed languages, in which manual actions are dominant, holds occur mainly in co-manual oral and co-manual facial actions. Conversely, in spoken language, vocal tract actions (i.e. actions of the lips, tongue, jaw, velum, and vocal folds) are dominant; holds occur primarily in co-verbal manual and co-verbal facial actions.
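The segmentation criterion described above — classifying each frame transition as movement or hold from frame-to-frame differences, then merging runs of like-labeled transitions into phases — can be sketched in code. This is an illustrative reconstruction, not the authors' procedure (the study used visual inspection of video frames); the threshold value and the toy trajectory are assumptions.

```python
# Sketch: segment a 1-D articulator trajectory (sampled at 30 frames/s)
# into alternating movement and hold phases from frame-to-frame differences.
# The threshold and the toy data are illustrative assumptions, not values
# from the study.

def segment_phases(positions, threshold=0.5):
    """Return a list of (label, start_frame, end_frame) phases.

    A transition whose absolute frame-to-frame difference exceeds
    `threshold` counts as movement; otherwise it counts as a hold.
    Consecutive transitions with the same label are merged into one phase.
    """
    diffs = [abs(b - a) for a, b in zip(positions, positions[1:])]
    labels = ["movement" if d > threshold else "hold" for d in diffs]
    phases = []
    start = 0
    for i in range(1, len(labels)):
        if labels[i] != labels[i - 1]:
            phases.append((labels[start], start, i))
            start = i
    phases.append((labels[start], start, len(labels)))
    return phases

# Toy trajectory: still, then moving, then still again.
track = [0.0, 0.0, 0.1, 1.5, 3.0, 4.4, 4.5, 4.5, 4.5]
print(segment_phases(track))
# → [('hold', 0, 2), ('movement', 2, 5), ('hold', 5, 8)]
```

On this reading, a strong-hand sign action with no holds would yield a single uninterrupted movement phase per sign, whereas co-manual facial actions would show hold phases between movements.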


References
1. Ambadar Z, Schooler J, Cohn JF. Deciphering the enigmatic face: the importance of facial dynamics to interpreting subtle facial expressions. Psychol Sci. 2005;16:403–10.
2. Bauer D, Kannampuzha J, Kröger BJ. Articulatory speech re-synthesis: profiting from natural acoustic speech data. In: Esposito A, Vich R, editors. Cross-modal analysis of speech, gestures, gaze and facial expressions, LNAI 5641. Berlin: Springer; 2009. p. 344–55.
4. Browman C, Goldstein L. Articulatory gestures as phonological units. Phonology. 1989;6:201–51.
5.
6. Cohn JF. Foundations of human computing: facial expression and emotion. In: Huang TS, Nijholt A, Pantic M, Pentland A, editors. Artificial intelligence for human computing, LNAI 4451. Berlin: Springer; 2007. p. 1–16.
7. Cohn JF, Ambadar Z, Ekman P. Observer-based measurement of facial expression with the facial action coding system. In: Coan JA, Allen JJB, editors. Handbook of emotion elicitation and assessment. New York: Oxford University Press; 2007. p. 203–21.
8. Dreuw P, Rybach D, Deselaers T, Zahedi M, Ney H. Speech recognition techniques for a sign language recognition system. Proceedings of Interspeech 2007 (Antwerp, Belgium); 2007. p. 2513–16.
9. Ekman P, Friesen WV. Measuring facial movement. Environ Psychol Nonverbal Behav. 1976;1:56–75.
10. Ekman P, Friesen WV. Facial action coding system. Palo Alto, CA: Consulting Psychologists Press; 1978.
11. Emmorey K. Language, cognition, and the brain: insights from sign language research. Lawrence Erlbaum Associates; 2002.
12. Fontana S. Mouth actions as gesture in sign language. Gesture. 2008;8:104–23.
13. Goldin-Meadow S. Hearing gesture. Cambridge, London: Belknap & Harvard University Press; 2003.
14. Goldstein L, Byrd D, Saltzman E. The role of vocal tract action units in understanding the evolution of phonology. In: Arbib MA, editor. Action to language via the mirror neuron system. Cambridge: Cambridge University Press; 2006. p. 215–49.
15. Goldstein L, Pouplier M, Chen L, Saltzman E, Byrd D. Dynamic action units slip in speech production errors. Cognition. 2007;103:386–412.
16. Kendon A. Language and gesture: unity or duality? In: McNeill D, editor. Language and gesture. Cambridge: Cambridge University Press; 2000. p. 47–63.
17. Kendon A. Gesture: visible action as utterance. New York: Cambridge University Press; 2004.
18. Klima E, Bellugi U. The signs of language. Cambridge, MA: Harvard University Press; 1979.
19. Kopp S, Wachsmuth I. Synthesizing multimodal utterances for conversational agents. J Comput Animat Virtual Worlds. 2004;15:39–51.
20. Kröger BJ, Birkholz P. A gesture-based concept for speech movement control in articulatory speech synthesis. In: Esposito A, Faundez-Zanuy M, Keller E, Marinaro M, editors. Verbal and nonverbal communication behaviours, LNAI 4775. Berlin: Springer; 2007. p. 174–89.
21. Kröger BJ, Birkholz P. Articulatory synthesis of speech and singing: state of the art and suggestions for future research. In: Esposito A, Hussain A, Marinaro M, editors. Multimodal signals: cognitive and algorithmic issues, LNAI 5398. Berlin: Springer; 2009. p. 306–19.
22. Kröger BJ, Kannampuzha J, Neuschaefer-Rube C. Towards a neurocomputational model of speech production and perception. Speech Commun. 2009;51:793–809.
23. Kröger BJ, Kopp S, Lowit A. A model for production, perception, and acquisition of actions in face-to-face communication. Cogn Process. 2010;11:187–205.
24. Lausberg H, Sloetjes H. Coding gestural behavior with the NEUROGES-ELAN system. Behav Res Methods. 2009;41(3):841–9.
25. Liberman AM, Mattingly IG. The motor theory of speech perception revised. Cognition. 1985;21:1–36.
26. Liddell SK, Johnson RE. American sign language: the phonological base. Sign Lang Stud. 1989;64:195–277.
27. Liddell SK, Metzger M. Gesture in sign language discourse. J Pragmat. 1998;30:657–97.
28. Liddell SK. Grammar, gesture and meaning in American sign language. New York: Cambridge University Press; 2003.
29. McNeill D. Hand and mind: what gestures reveal about thought. Chicago: University of Chicago Press; 1992.
30. McNeill D. Gesture and thought. Chicago: University of Chicago Press; 2005.
31. McNeill D, Quek F, McCullough K-E, Duncan SD, Furuyama N, Bryll R, Ansari R. Catchments, prosody and discourse. Gesture. 2001;1(1):9–33.
32. Perlmutter DM. Sonority and syllable structure in American sign language. Linguist Inq. 1992;23:407–42.
33. Saltzman E, Byrd D. Task-dynamics of gestural timing: phase windows and multifrequency rhythms. Hum Mov Sci. 2000;19:499–526.
34. Sandler W. Symbiotic symbolization by hand and mouth in sign language. Semiotica. 2009;174:241–75.
35. Schmidt KL, Ambadar Z, Cohn JF, Reed LI. Movement differences between deliberate and spontaneous facial expressions: zygomaticus major action in smiling. J Nonverbal Behav. 2006;30:37–52.
36. Schmidt KL, Bhattacharya S, Denlinger R. Comparison of deliberate and spontaneous facial movement in smiles and eyebrow raises. J Nonverbal Behav. 2009;33:35–45.
37. Schmidt KL, Cohn JF, Tian Y. Signal characteristics of spontaneous facial expressions: automatic movement in solitary and social smiles. Biol Psychol. 2003;65:49–66.
38. Stokoe WC. Sign language structure: an outline of the visual communication systems of the American Deaf. Studies in Linguistics Occasional Paper 8. University of Buffalo; 1960.
39. Tian YL, Kanade T, Cohn JF. Facial expression analysis. In: Li SZ, Jain AK, editors. Handbook of face recognition. New York: Springer; 2005. p. 247–75.
40. Valli C, Lucas C. Linguistics of American sign language: an introduction. Washington: Gallaudet University Press; 2000.
42. Wilcox S, Morford JP. Empirical methods in signed language research. In: Gonzalez-Marquez M, Mittelberg I, Coulson S, Spivey MJ, editors. Methods in cognitive linguistics. Amsterdam/Philadelphia: John Benjamins; 2007. p. 171–200.
Metadata
Title
Movements and Holds in Fluent Sentence Production of American Sign Language: The Action-Based Approach
Authors
Bernd J. Kröger
Peter Birkholz
Jim Kannampuzha
Emily Kaufmann
Irene Mittelberg
Publication date
01.09.2011
Publisher
Springer-Verlag
Published in
Cognitive Computation / Issue 3/2011
Print ISSN: 1866-9956
Electronic ISSN: 1866-9964
DOI
https://doi.org/10.1007/s12559-010-9071-2
