2000 | Original Paper | Book Chapter
Gaze and Speech in Attentive User Interfaces
Authors: Paul P. Maglio, Teenie Matlock, Christopher S. Campbell, Shumin Zhai, Barton A. Smith
Published in: Advances in Multimodal Interfaces — ICMI 2000
Publisher: Springer Berlin Heidelberg
The trend toward pervasive computing necessitates finding and implementing appropriate ways for users to interact with devices. We believe the future of interaction with pervasive devices lies in attentive user interfaces: systems that pay attention to what users do so that they can attend to what users need. Such systems track user behavior, model user interests, and anticipate user desires and actions. Beyond developing the technologies that support attentive user interfaces, and the applications or scenarios that use them, there remains the problem of evaluating the utility of the attentive approach. With this last point in mind, we observed users in an “office of the future,” where information is accessed on displays via verbal commands. Based on users’ verbal data and eye-gaze patterns, our results suggest that people naturally address individual devices rather than the office as a whole.