2009 | Original Paper | Book Chapter
Automated Generation of Emotive Virtual Humans
Authors: Joon Hao Chuah, Brent Rossen, Benjamin Lok
Published in: Intelligent Virtual Agents
Publisher: Springer Berlin Heidelberg
Emotive virtual humans (VHs) are important for affective interactions with embodied conversational agents [1]. However, creating emotive VHs requires significant resources and time; for example, the VHs in movies and video games require teams of animators and months of work. VHs can also be imbued with emotion using appraisal theory methods, which apply psychology-based models that evaluate external events against the VH's goals and beliefs to generate emotions. These external events require manual tagging or natural language understanding [2]. As an alternative, we propose tagging VH responses with emotions using textual affect sensing. The method developed by Neviarouskaya et al. [3] uses syntactic parses and a database of words and their associated emotion intensities. We use this database, and because the emotions are associated with specific words, we can combine them with audio timing information to generate lip-synched facial expressions. Our approach, AutoEmotion, automatically adds basic emotions to VHs without manual animation, manual tagging, or natural language understanding.
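The core idea of the abstract, pairing a word-level affect lexicon with audio timing to produce expression keyframes, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the lexicon entries, the keyframe format, and the function names are all assumptions, and the timing tuples stand in for output from a forced aligner or text-to-speech engine.

```python
# Hypothetical sketch of lexicon-based affect tagging combined with
# word-level audio timings. The lexicon below is a toy stand-in for a
# database of words and emotion intensities such as the one described
# by Neviarouskaya et al.
AFFECT_LEXICON = {
    "happy": ("joy", 0.8),
    "terrible": ("sadness", 0.7),
    "afraid": ("fear", 0.9),
}

def tag_response(words_with_times):
    """Given (word, start_sec, end_sec) tuples from audio alignment,
    return facial-expression keyframes for words found in the lexicon,
    so each expression can be synchronized with the spoken word."""
    keyframes = []
    for word, start, end in words_with_times:
        entry = AFFECT_LEXICON.get(word.lower())
        if entry:
            emotion, intensity = entry
            keyframes.append({
                "time": start,          # trigger the expression as the word begins
                "emotion": emotion,
                "intensity": intensity,
                "hold_until": end,      # release when the word ends
            })
    return keyframes

# Example: word timings as they might come from audio alignment
timed = [("I", 0.0, 0.1), ("feel", 0.1, 0.4), ("happy", 0.4, 0.9)]
print(tag_response(timed))
```

Because every keyframe is anchored to a word's start and end times, the facial expression and the lip-synched audio stay aligned by construction, which is the property the abstract highlights.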