2009 | Original Paper | Book Chapter
Integrating Graph-Based Vision Perception to Spoken Conversation in Human-Robot Interaction
Authors: Wendy Aguilar, Luis A. Pineda
Published in: Bio-Inspired Systems: Computational and Ambient Intelligence
Publisher: Springer Berlin Heidelberg
In this paper we present the integration of graph-based visual perception with spoken conversation in human-robot interaction. The proposed architecture has a dialogue manager as the central component for multimodal interaction, which directs the robot’s behavior in terms of the intentions and actions associated with conversational situations. We tested these ideas on a mobile robot programmed to act as a visitor’s guide to our department of computer science.