Issue 4/2017
Semantic Interpretation of Multi-Modal Human-Behaviour Data
Contents (11 articles)
Editorial
Semantic Interpretation of Multi-Modal Human-Behaviour Data
Mehul Bhatt, Kristian Kersting
Technical Contribution
Automated interpretation of eye–hand coordination in mobile eye tracking recordings
Moritz Mussgnug, Daniel Singer, Quentin Lohmeyer, Mirko Meboldt
Technical Contribution
Automatic Detection of Visual Search for the Elderly using Eye and Head Tracking Data
Michael Dietz, Daniel Schork, Ionut Damian, Anika Steinert, Marten Haesner, Elisabeth André
Technical Contribution
Assigning Group Activity Semantics to Multi-Device Mobile Sensor Data
Seng W. Loke, Amin Bakshandeh Abkenar
Research Project
Red Hen Lab: Dataset and Tools for Multimodal Human Communication Research
Jungseock Joo, Francis F. Steen, Mark Turner
Doctoral and Postdoctoral Dissertations
Hierarchical Hybrid Planning for Mobile Robots
Sebastian Stock
Doctoral and Postdoctoral Dissertations
Using Ontology-Based Data Access to Enable Context Recognition in the Presence of Incomplete Information
Veronika Thost
Community