2005 | Original Paper | Book Chapter
An Affective User Interface Based on Facial Expression Recognition and Eye-Gaze Tracking
Authors: Soo-Mi Choi, Yong-Guk Kim
Published in: Affective Computing and Intelligent Interaction
Publisher: Springer Berlin Heidelberg
This paper describes a pipeline in which the user's facial expression and eye gaze are tracked, and a 3D facial animation is then synthesized at a remote site based on the timing information of the facial and eye movements. The system first detects the facial area within a given image and then classifies the facial expression into seven emotional weightings. This weighting information, transmitted to a PDA over a mobile network, drives a non-photorealistic facial expression animation. Facial expression animation using emotional curves turns out to be more effective at conveying the timing of an expression than the linear interpolation method. The emotional avatar embedded on a mobile platform shows some potential for conveying emotion between people over the Internet.
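To make the contrast between the two interpolation schemes concrete, the sketch below shows how a seven-dimensional emotion weight vector might be blended between two expressions. The emotion labels, the smoothstep function standing in for the paper's "emotional curves", and all function names are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: blending 7 emotion weights with a timing curve
# versus plain linear interpolation. The smoothstep curve is only a
# stand-in for the paper's emotional curves.

EMOTIONS = ["neutral", "happy", "sad", "angry",
            "surprised", "disgusted", "afraid"]

def linear(t):
    """Linear timing: progress equals elapsed fraction."""
    return t

def emotional_curve(t):
    """Ease-in/ease-out timing (smoothstep), an assumed shape."""
    return t * t * (3.0 - 2.0 * t)

def blend_weights(start, end, t, curve=linear):
    """Interpolate between two 7-dim emotion weight vectors at time t."""
    s = curve(t)
    return [a + (b - a) * s for a, b in zip(start, end)]

# Transition from a neutral face to a fully happy face.
neutral = [1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
happy   = [0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0]

# Early in the transition (t = 0.25) the curved timing lags behind
# the linear one, giving a slower onset of the expression.
mid_linear = blend_weights(neutral, happy, 0.25, linear)
mid_curve  = blend_weights(neutral, happy, 0.25, emotional_curve)
print("happy weight, linear:", mid_linear[1])   # 0.25
print("happy weight, curved:", mid_curve[1])    # 0.15625
```

With the eased curve the expression builds up slowly, peaks in speed mid-transition, and settles gently, which is the kind of timing difference the paper reports as more natural than linear interpolation.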