2013 | Original Paper | Book Chapter
Situated Multiparty Interaction between Humans and Agents
Authors: Aasish Pappu, Ming Sun, Seshadri Sridharan, Alex Rudnicky
Published in: Human-Computer Interaction. Interaction Modalities and Techniques
Publisher: Springer Berlin Heidelberg
A social agent such as a receptionist or an escort robot faces challenges when communicating with people in open areas. The agent must avoid reacting to distracting acoustic and visual events, and it needs to handle situations involving multiple humans, focusing on active interlocutors and shifting attention appropriately based on context. We describe a multiparty interaction agent that helps multiple users arrange a common activity. In the user study we conducted, we found that the agent discriminates well between active and inactive interlocutors using skeletal and azimuth information. Participants also found the addressee much clearer when an animated talking head was used.
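The abstract mentions combining skeletal (body-pose) cues with sound-source azimuth to tell active interlocutors from bystanders. A minimal heuristic along those lines might be sketched as follows; the function name, thresholds, and the simple AND rule are illustrative assumptions, not the paper's actual method:

```python
def is_active_interlocutor(torso_yaw_deg, person_azimuth_deg,
                           speech_azimuth_deg,
                           facing_thresh=30.0, azimuth_thresh=15.0):
    """Illustrative heuristic (not the paper's algorithm): a person counts
    as an active interlocutor when their torso roughly faces the agent and
    detected speech arrives from their direction.

    torso_yaw_deg: person's torso rotation relative to the agent
                   (0 = squarely facing the agent), from skeletal tracking
    person_azimuth_deg: person's bearing as seen by the agent
    speech_azimuth_deg: bearing of the detected speech source from the
                        microphone array, or None when no one is speaking
    """
    facing = abs(torso_yaw_deg) <= facing_thresh
    speech_colocated = (
        speech_azimuth_deg is not None
        and abs(person_azimuth_deg - speech_azimuth_deg) <= azimuth_thresh
    )
    return facing and speech_colocated

# Person facing the agent while speech arrives from their bearing:
print(is_active_interlocutor(10.0, -20.0, -18.0))   # True
# Bystander turned away, with speech coming from elsewhere:
print(is_active_interlocutor(80.0, 40.0, -18.0))    # False
```

In practice a real system would smooth both cues over time and weigh them probabilistically rather than thresholding a single frame, but the sketch shows why fusing the two modalities helps: pose alone cannot distinguish a silent onlooker from a speaker, and azimuth alone cannot distinguish an addressed utterance from side talk.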