ABSTRACT
We introduce an experimental setting to observe and measure the perception of facial expressions performed by embodied conversational agents (ECAs). The experimental set-up enables measuring the effects of embodied conversational agents in various contextual settings. We developed a matrix distinguishing three types of stimulus dimensions (simultaneous, sequential, adaptive) and three contextual configurations. The aim of the study is to establish an experimental framework for analyzing subjects' emotional state across different context dimensions. The experimental setting helps to close a gap at the intersection of affective computing and embodied conversational agents.