
2012 | Book

Immersive Multimodal Interactive Presence


About this book

Immersive Multimodal Interactive Presence presents advanced interdisciplinary approaches that connect psychophysical and behavioral haptics research with advances in haptic technology and haptic rendering.

It delivers a summary of the results achieved in the IMMERSENCE European project and includes selected chapters by international researchers. Organized into two parts: I. Psychophysical and Behavioral Basis and II. Technology and Rendering, it is an excellent example of interdisciplinary research directed towards the advancement of multimodal immersive virtual environments with particular focus on haptic interaction.

The twelve chapters of the book are grouped around three different scenarios representing different types of interactions in virtual environments: Person-Object (PO), Person-Object-Person (POP) and Person-Person (PP) interaction. Recent results of psychophysical and behavioral studies are reported along with new technological developments for haptic displays and novel haptic rendering techniques.

Table of Contents

Frontmatter
Chapter 1. Introduction
Abstract
Research on high-fidelity visual and audio feedback has dominated the design and development of virtual environments for nearly half a century. Over the last decade the importance of haptic feedback has been recognized and extensive effort has been devoted to haptic interaction research. This book explores haptic feedback in virtual environments based on three different scenarios involving Person-Object (PO), Person-Object-Person (POP) and Person-Person (PP) interactions. The book highlights the interdisciplinary nature of haptic interaction research and reports recent results of psychophysical and behavioral studies along with new technological developments for haptic displays and novel haptic rendering techniques. In this introductory chapter, we provide motivation for this interdisciplinary approach followed by a detailed description of chapters composing this book.
Christos D. Giachritsis, Angelika Peer

Psychophysical and Behavioral Basis

Frontmatter
Chapter 2. Active Movement Reduces the Tactile Discrimination Performance
Abstract
Self-performed arm movements are known to increase the tactile detection threshold (i.e., to decrease tactile sensitivity), which is part of a phenomenon called tactile suppression. To date, the origin and the effects of tactile suppression are not fully understood. Tactile discrimination tasks have been utilized to quantify the changes in tactile sensitivity due to arm movements and to identify the origin of tactile suppression. The results show that active arm movement also increases tactile discrimination thresholds, which had not been shown before. Furthermore, it is shown that tactile sensitivity drops approximately 100 ms before the actual arm movement. We conclude that tactile suppression has two origins: (1) a movement-related motor command, a neuronal signal that can be measured 100 ms before muscle contraction and that accounts for the increase of the discrimination threshold prior to the arm movement, and (2) task-irrelevant sensory input, which reduces tactile sensitivity after the onset of the arm movement.
Marco P. Vitello, Michael Fritschi, Marc O. Ernst
Chapter 3. Weight Perception with Real and Virtual Weights Using Unimanual and Bimanual Precision Grip
Abstract
Accurate weight perception is crucial for effective object manipulation in the real world. The design of force feedback haptic interfaces for precise manipulation of virtual objects should take into account how weight simulation is implemented when manipulation involves one or two hands and one or two objects. The importance of this is apparent in tasks where the user must apply vertical forces to penetrate a surface with a tool or slice a fragile object using one and/or two hands. Accurate perception of simulated weight should allow the user to execute the task with high precision. Nonetheless, the most commonly used force feedback interfaces for direct manipulation of virtual objects use a thimble through which the user interacts with the virtual object. While this allows use of proprioceptive weight information, it may reduce the cutaneous feedback that also provides weight information through pressure and skin deformation when interacting with real objects. In this chapter, we present research on unimanual and bimanual weight perception with real and virtual objects, and examine the magnitude of the grip forces involved and the relative contributions of cutaneous and proprioceptive information to weight perception.
Christos D. Giachritsis, Alan M. Wing
Chapter 4. The Image of Touch: Construction of Meaning and Task Performance in Virtual Environments
Abstract
One of the central questions in sensory interaction is how an image of the environment is constructed. In this chapter we report the results of a series of studies conducted in a telesurgery system, aimed at exploring the cognitive process underpinning the construction of meaning through dynamic haptic patterns. We use a double analysis, the first rooted in behavioral methodology and the second in signal processing methodologies, to validate the existence of a haptic language, and apply the results to challenging contexts in which haptic cues are subliminal, in order to identify whether and how subliminal cues play a role in a primitive haptic perceptual language. We show that although participants are not aware of subliminal cues, both action and perception are nevertheless affected, suggesting that meaning is extracted from subliminal haptic patterns. We conclude by suggesting a unified framework of a haptic language mechanism for the construction of mental models of the environment, rooted in embodied cognition and the enactive theoretical framework.
Miriam Reiner
Chapter 5. Psychological Experiments in Haptic Collaboration Research
Abstract
This chapter discusses the role of psychological experiments when designing artificial partners for haptic collaboration tasks, such as joint object manipulation. After an introduction, which motivates this line of research and provides the corresponding definitions, the first part of this chapter presents theoretical considerations on psychological experiments in the general design process of interactive artificial partners. Furthermore, challenges related to the specific research interest in haptic collaboration are introduced. Next, two concrete examples of psychological experiments are given: (a) a study in which we examine whether dominance behavior depends on the interacting partner; the obtained results are discussed in relation to design guidelines for artificial partners; and (b) an experiment which focuses on the evaluation of different configurations of the implemented artificial partner, where the focus is again on experimental challenges, especially measurements. In the conclusion, the two roles of experiments in the design process of artificial partners for haptic tasks are contrasted.
Raphaela Groten, Daniela Feth, Angelika Peer, Martin Buss
Chapter 6. Human-Robot Adaptive Control of Object-Oriented Action
Abstract
This chapter is concerned with how implicit, nonverbal cues support coordinated action between two partners. Recently, neuroscientists have started uncovering the brain mechanisms involved in how people make predictions about other people’s behavioural goals and intentions through action observation. To date, however, only a small number of studies have addressed how the involvement of a task partner influences the planning and control of one’s own purposeful action. Here, we review three studies of cooperative action between human and robot partners that address the nature of predictive and reactive motor control in cooperative action. We conclude with a model that achieves motor coordination by having each task partner adjust their actions on the basis of the previous trial’s outcome.
Satoshi Endo, Paul Evrard, Abderrahmane Kheddar, Alan M. Wing
Chapter 7. Cooperative Physical Human-Human and Human-Robot Interaction
Abstract
This chapter examines the physical interaction between two humans, and between a human and a robot simulating a human, in the absence of all other modes of interaction, such as visual and verbal. Generally, when asked, people prefer to work alone on tasks requiring accuracy. However, as demonstrated by the research in this chapter, when individuals are placed in teams requiring physical cooperation, their performance is frequently better than their individual performance, despite perceptions that the other person was an impediment. Although dyads are able to perform certain actions significantly faster than individuals, dyads also exert large opposition forces. These opposition forces do not contribute to completing the task, but are the sole means of haptic communication between the partners. Using only this haptic communication channel, dyads were able to temporally divide the task based on task phase. This chapter provides further details on how two people haptically cooperate on physical tasks.
Kyle B. Reed

Technology and Rendering

Frontmatter
Chapter 8. Data-Driven Visuo-Haptic Rendering of Deformable Bodies
Abstract
Our current research focuses on the investigation of new algorithmic paradigms for the data-driven generation of sensory feedback. The key notion is the collection of all relevant data characterizing an object as well as the interaction during a recording stage via multimodal sensing suites. The recorded data are then processed in order to convert the raw signals into abstract descriptors. This abstraction then also enables us to provide feedback for interaction which has not been observed before. We have developed a first integrated prototype implementation of the envisioned data-driven visuo-haptic acquisition and rendering system. It allows users to acquire the geometry and appearance of an object. In this chapter we outline the individual components and provide details on necessary extensions to also accommodate interaction scenarios involving deformable objects.
Matthias Harders, Raphael Hoever, Serge Pfeifer, Thibaut Weise
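The data-driven rendering idea summarized above (replaying recorded object responses instead of simulating a physical model) can be illustrated with a minimal sketch; the sample force-indentation table and the interpolation scheme below are hypothetical, not taken from the chapter:

```python
import numpy as np

# Recorded indentation-depth/force pairs from an (illustrative)
# acquisition stage; a real system would record far denser data.
recorded_depth = np.array([0.000, 0.002, 0.004, 0.006])  # m
recorded_force = np.array([0.0,   0.4,   1.1,   2.3])    # N

def rendered_force(depth):
    """Data-driven force response: piecewise-linear interpolation
    of the recorded samples, evaluated at haptic update rate."""
    return float(np.interp(depth, recorded_depth, recorded_force))
```

A depth of 0.003 m, for instance, falls between two recorded samples and yields their linear blend; richer systems interpolate over abstract descriptors rather than raw depth alone.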
Chapter 9. Haptic Rendering Methods for Multiple Contact Points
Abstract
The vast majority of haptic applications focus on single-contact-point interaction between the user and the virtual scenario. One contact point is suitable for applications such as palpation or object exploration; however, two or more contact points are required for more complex tasks such as grasping or object manipulation. Traditionally, when single-point haptic devices are applied to a complex task, a key or a switch is used for grasping objects. Using multiple contact points for such complex manipulation tasks significantly increases the realism of haptic interactions. Moreover, virtual scenarios with multiple contact points also allow the development of multi-user cooperative virtual manipulation, since several users can simultaneously interact within the same scenario and perceive the actions performed by others. This represents a step forward from current haptic applications, which are usually based on a single user.
Recreating these scenarios in a stable and realistic way is a very challenging goal due to the complexity of the computational models, which require integrating all interactions of multiple haptic devices and calculating the corresponding actions on the virtual objects. It also requires solving, in real time, the mathematical equations that represent the behavior of the virtual objects. Delays in these calculations can lead to instabilities that may produce vibrations in the haptic devices and reduce the realism of the simulation. To mitigate these problems, different kinds of solutions have been considered, such as (i) models based on simplified calculations of the forces and torques involved in object interactions, and (ii) models based on virtual coupling between objects in order to ensure the stability of the simulation.
The experiments described in this work were performed using a multifinger haptic device called MasterFinger-2 (MF-2), which has been applied in different kinds of multiple-contact-point applications.
Manuel Ferre, Pablo Cerrada, Jorge Barrio, Raul Wirz
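Virtual coupling, option (ii) above, is commonly realized as a spring-damper link between the haptic device proxy and the simulated object; the reaction force is rendered back to the device. A minimal sketch, with illustrative stiffness and damping values rather than the chapter's implementation:

```python
import numpy as np

def coupling_force(device_pos, object_pos, device_vel, object_vel,
                   stiffness=500.0, damping=5.0):
    """Spring-damper force pulling the virtual object toward the
    device proxy; -force is fed back to the haptic device."""
    spring = stiffness * (np.asarray(device_pos) - np.asarray(object_pos))
    damper = damping * (np.asarray(device_vel) - np.asarray(object_vel))
    return spring + damper

# Example: the device leads the object by 1 mm along x, both at rest,
# so only the spring term contributes (500 N/m * 0.001 m = 0.5 N).
f = coupling_force([0.001, 0.0, 0.0], [0.0, 0.0, 0.0],
                   [0.0, 0.0, 0.0], [0.0, 0.0, 0.0])
```

Because the device never connects rigidly to the object, the coupling bounds the impedance presented to the user, which is what keeps the simulation stable under computation delays.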
Chapter 10. Artificially Rendered Cutaneous Cues for a New Generation of Haptic Displays
Abstract
In this chapter we report on two architectures of haptic devices able to reproduce variable softness and elicit tactile sensations. Both solutions aim at addressing cutaneous channels more effectively. The first device consists of a tactile flow-based display coupled with a commercial kinesthetic interface. The second device is based on a pin-array configuration designed to stimulate the fingertips locally and induce the illusion of different shapes or moving objects.
Enzo Pasquale Scilingo, Matteo Bianchi, Nicola Vanello, Valentina Hartwig, Luigi Landini, Antonio Bicchi
Chapter 11. Social Haptic Interaction with Virtual Characters
Abstract
Adding physicality to virtual environments is considered a prerequisite for achieving natural interaction behavior. While physical properties and laws can be built into virtual environments by means of physics engines, providing haptic feedback to the user requires appropriately designed and controlled haptic devices, as well as sophisticated haptic rendering algorithms. While in the past a variety of haptic rendering algorithms for the simulation of human-object interactions were developed, haptic interactions with a virtual character are still underinvestigated. Such interactions, however, pose a number of new challenges compared to the rendering of human-object interactions, as the human expects to interact with a character that shows human-like behavior, i.e., one that is able to estimate human intentions, to communicate its own intentions, and to adapt its behavior to its partner. Accordingly, algorithms for intention recognition, interactive path planning, control, and adaptation are required when implementing such interactive characters. In this chapter two different approaches to the design of interactive behavior are reviewed: an engineering-driven and a human-centred approach. Following the latter, virtual haptic interaction partners are realized using the record-replay-recreate workflow. To demonstrate the validity of this approach, it is applied to two prototypical application scenarios: handshaking and dancing.
Zheng Wang, Jens Hölldampf, Angelika Peer, Martin Buss
Chapter 12. FMRI Compatible Sensing Glove for Hand Gesture Monitoring
Abstract
Here we describe and validate a fabric sensing glove for monitoring hand and finger movements. After a quick calibration procedure, and by suitably processing the outputs of the glove, it is possible to estimate hand joint angles in real time. Moreover, we tested the fMRI compatibility of the glove and ran a pilot fMRI experiment on the neural correlates of handshaking during human-to-human and human-to-robot interactions. We also describe how the glove can be used to monitor correct task execution and to improve modeling of the expected hemodynamic responses in fMRI experimental paradigms.
Nicola Vanello, Valentina Hartwig, Enzo Pasquale Scilingo, Daniela Bonino, Emiliano Ricciardi, Alessandro Tognetti, Pietro Pietrini, Danilo De Rossi, Luigi Landini, Antonio Bicchi
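A calibration step like the one described above (mapping fabric-sensor outputs to joint angles) is often realized as a least-squares fit. The sketch below assumes a simple affine sensor model with made-up readings and reference angles; it is not the chapter's actual procedure:

```python
import numpy as np

def fit_calibration(readings, angles):
    """Least-squares affine map W such that angles ~= [readings, 1] @ W,
    fitted from a few reference postures recorded during calibration."""
    X = np.hstack([readings, np.ones((len(readings), 1))])
    W, *_ = np.linalg.lstsq(X, angles, rcond=None)
    return W

def estimate_angles(W, reading):
    """Real-time joint-angle estimate for one sensor reading vector."""
    return np.append(reading, 1.0) @ W

# Calibration postures: two sensor channels, one joint angle (degrees).
readings = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
angles = np.array([[10.0], [30.0], [15.0], [35.0]])  # illustrative
W = fit_calibration(readings, angles)
```

An affine model keeps the per-sample cost to one matrix-vector product, which is what makes real-time estimation during an fMRI run practical.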
Chapter 13. Improving Human-Computer Cooperation Through Haptic Role Exchange and Negotiation
Abstract
Even though in many systems computers have been programmed to share control with human operators in order to increase task performance, the interaction in such systems is still artificial when compared to natural human-human cooperation. In complex tasks, cooperating human partners may have their own agendas and take initiatives during the task. Such initiatives contribute to a richer interaction between cooperating parties, yet little research exists on how this can be established between a human and a computer. In a cooperation involving haptics, the coupling between the human and the computer should be defined such that the computer can understand the intentions of the human operator and respond accordingly. We believe that this will make the haptic interactions between the human and the computer more natural and human-like. In this regard, we suggest (1) a role exchange mechanism that is activated based on the magnitude of the force applied by the cooperating parties and (2) a negotiation model that enables more human-like coupling between the cooperating parties. We argue that when presented through the haptic channel, the proposed role exchange mechanism and negotiation model enable the cooperating parties to communicate dynamically, naturally, and seamlessly, in addition to improving the task efficiency of the user. In this chapter, we explore how human-computer cooperation can be improved using a role exchange mechanism and a haptic negotiation framework. We also discuss the use of haptic negotiation in assigning different behaviors to the computer, as well as the effectiveness of visual and haptic cues in conveying negotiation-related complex affective states. Throughout this chapter, we adopt a broad terminology and speak of cooperative systems, in which both parties take some part in control, as shared control schemes; the term “control” is merely used to address the partners’ manipulation capacities on the task.
Ayse Kucukyilmaz, Salih Ozgur Oguz, Tevfik Metin Sezgin, Cagatay Basdogan
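A force-magnitude-triggered role exchange of the kind described above can be sketched as a threshold rule with hysteresis, so that control authority does not chatter near the boundary. The thresholds and class interface below are illustrative assumptions, not the authors' implementation:

```python
class RoleExchange:
    """Toggle control authority based on the human's applied force."""

    def __init__(self, take_threshold=5.0, release_threshold=2.0):
        self.take = take_threshold        # N: force above which human leads
        self.release = release_threshold  # N: force below which lead returns
        self.human_leads = False          # computer leads initially

    def update(self, human_force_magnitude):
        """Call once per haptic cycle; returns True while the human leads."""
        if not self.human_leads and human_force_magnitude > self.take:
            self.human_leads = True       # human takes control authority
        elif self.human_leads and human_force_magnitude < self.release:
            self.human_leads = False      # computer resumes the lead
        return self.human_leads
```

With distinct take/release thresholds, a force that hovers around a single cutoff cannot flip the roles back and forth on every cycle.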
Backmatter
Metadata
Title
Immersive Multimodal Interactive Presence
Editors
Angelika Peer
Christos D. Giachritsis
Copyright Year
2012
Publisher
Springer London
Electronic ISBN
978-1-4471-2754-3
Print ISBN
978-1-4471-2753-6
DOI
https://doi.org/10.1007/978-1-4471-2754-3