It is our great pleasure to welcome you to ICMI 2003, the Fifth International Conference on Multimodal Interfaces, held 5-7 November 2003 in Vancouver, Canada, and sponsored by the Association for Computing Machinery (ACM) and ACM SIGCHI. This year's meeting is collocated with the ACM UIST conference, held immediately prior to ICMI at the same location; the first two papers of ICMI are presented in a joint session of the two conferences.

We are grateful to the National Science Foundation, Microsoft Corporation, Nissan Motor Company, NEWMic, General Motors, Mitsubishi Electric Research Labs, the Oregon Health & Science University, and Natural Interaction Systems for their generous support.

Over 130 submissions were reviewed in a double-blind process by an international program committee. From this pool, 32 long papers and 13 short papers were selected for inclusion in the proceedings. Papers were assigned to oral or poster presentation based on the program committee's judgment of the most effective mode of communicating the work, not on an assessment of relative quality. Demonstrations were invited for presentation at the conference, and a two-page abstract describing each demonstration could optionally be included in these proceedings.

The conference features invited talks from prominent researchers in three key areas of multimodal interfaces: Sandra Marshall on evaluating interfaces with eye tracking, Charles Spence on designing for cross-modal attention, and Anil Jain on multibiometrics and user identification. Abstracts of their keynote addresses are included in these proceedings.
Multimodal user interfaces: who's the user?
A wide variety of systems require reliable personal recognition schemes to either confirm or determine the identity of an individual requesting their services. The purpose of such schemes is to ensure that only a legitimate user, and not anyone else, ...
New techniques for evaluating innovative interfaces with eye tracking
Computer interfaces are changing rapidly, as are the cognitive demands on the operators using them. Innovative applications of new technologies such as multimodal and multimedia displays, haptic and pen-based interfaces, and natural language exchanges ...
Crossmodal attention and multisensory integration: implications for multimodal interface design
One of the most important findings to emerge from the field of cognitive psychology in recent years has been the discovery that humans have a very limited ability to process incoming sensory information. In fact, contrary to many of the most influential ...