
2014 | Book

Haptics: Neuroscience, Devices, Modeling, and Applications

9th International Conference, EuroHaptics 2014, Versailles, France, June 24-26, 2014, Proceedings, Part I

Edited by: Malika Auvray, Christian Duriez

Publisher: Springer Berlin Heidelberg

Book series: Lecture Notes in Computer Science


About this book

The two-volume set LNCS 8618 and 8619 constitutes the refereed proceedings of the 9th International Conference EuroHaptics 2014, held in Versailles, France, in June 2014.
The 118 papers (36 oral presentations and 82 poster presentations) presented were carefully reviewed and selected from 183 submissions. Furthermore, 27 demos were exhibited, each of them resulting in a short paper included in the volumes. These proceedings reflect the multidisciplinary nature of EuroHaptics and cover topics such as human-computer interaction, human-robot interactions, neuroscience, perception and psychophysics, biomechanics and motor control, modelling and simulation, as well as a broad range of applications in medicine, rehabilitation, art, and design.

Table of contents

Frontmatter
Erratum to: Haptics: Neuroscience, Devices, Modeling, and Applications [Part II]
Malika Auvray, Christian Duriez

Fundamentals of Haptic Perception

Frontmatter
Semantically Layered Structure of Tactile Textures

This study constructed a multi-layered semantic structure of adjective pairs that are used to express the tactile properties of materials. We analyzed the causalities between adjectives using the DEMATEL method, involving twenty-nine adjective pairs and forty-six materials, and thus constructed a multi-layered structure. The constructed structure contained psychophysical, affective, and preferential layers. The psychophysical layer consisted of words related to the physical properties of materials, which were rough/smooth, uneven/flat, hard/soft, cold/warm, sticky/slippery, and wet/dry. The affective layer included affective sensations such as comfortable/uncomfortable and friendly/unfriendly. The words related to individual preferences, which include rich/poor, good/bad, like/dislike, and happy/sad, were located on the highest layer of the structure. The constructed structure helps us understand affective and preferential perceptions of materials through psychophysical perceptions.

Hikaru Nagano, Shogo Okamoto, Yoji Yamada
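
As a reading aid to the abstract above, the DEMATEL analysis it names follows a standard recipe: build a direct-influence matrix from pairwise causality ratings, normalize it, and compute the total-relation matrix to separate "cause" adjectives (lower, psychophysical layers) from "effect" adjectives (higher, preferential layers). A minimal Python sketch of that computation; the adjective pairs and ratings below are hypothetical placeholders, not the paper's data.

    import numpy as np

    # Hypothetical direct-influence matrix D: D[i, j] = rated influence of
    # adjective pair i on adjective pair j (the study used 29 pairs; 3 shown here).
    adjectives = ["rough/smooth", "comfortable/uncomfortable", "like/dislike"]
    D = np.array([[0.0, 3.0, 2.0],
                  [0.0, 0.0, 3.0],
                  [0.0, 1.0, 0.0]])

    # Standard DEMATEL: scale D so that row/column sums stay below 1 ...
    s = max(D.sum(axis=1).max(), D.sum(axis=0).max())
    N = D / s

    # ... then the total-relation matrix T = N (I - N)^-1 accumulates indirect influence.
    T = N @ np.linalg.inv(np.eye(len(D)) - N)

    # Prominence (r + c) and relation (r - c): a positive r - c marks a "cause"
    # (lower layer), a negative value an "effect" (higher layer).
    r, c = T.sum(axis=1), T.sum(axis=0)
    for name, prom, rel in zip(adjectives, r + c, r - c):
        print(f"{name:28s} prominence={prom:5.2f} relation={rel:+5.2f}")
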
Roughness Modulation of Real Materials Using Electrotactile Augmentation

In this paper, we present a roughness modulation technique that employs electrotactile augmentation to alter material texture perception, which is conducted through mechanically unconstrained touch. A novel electrotactile augmented reality system that superimposes modulating nerve activity onto afferent nerves at the middle phalanx of a finger is described. We conducted a user study in which participants were requested to rate the roughness of real materials that were explored using the system. The results indicated that participants could perceive the modulated fine- and macro-roughness via the electrotactile augmentation.

Shunsuke Yoshimoto, Yoshihiro Kuroda, Yuki Uranishi, Masataka Imura, Osamu Oshiro
Proprioceptive Biases in Different Experimental Designs

Systematic biases have been found when matching proprioceptive and visual locations. In this study we investigated whether such visuo-haptic biases depend on the indicator that subjects use and on whether targets are presented on a board (2D) or in free space (3D). Subjects had to move their unseen hand to visually presented targets. They indicated the perceived target position either with their index finger or with a hand-held device, and either on a surface or in free space. We found no differences in either the magnitudes or the directions of the biases. Thus, the results of studies on visuo-haptic biases do not merely depend on the experimental design used and can be generalized.

Irene A. Kuling, Marieke C. W. van der Graaff, Eli Brenner, Jeroen B. J. Smeets
Delayed Haptic Feedback to Gaze Gestures

Haptic feedback can improve the usability of gaze gestures in mobile devices. However, the benefit is highly sensitive to the exact timing of the feedback. In practical systems the processing and transmission of signals takes some time, and the feedback may be delayed. We conducted an experiment to determine limits on the feedback delays. The results show that when the delays increase to 200 ms or longer the task completion times are significantly longer than with shorter delays.

Jari Kangas, Jussi Rantala, Deepak Akkil, Poika Isokoski, Päivi Majaranta, Roope Raisamo
A System for Evaluating Tactile Feelings Expressed by Sound Symbolic Words

Subjective evaluations of tactile sensations have been quantified using several pairs of adjective words describing material properties such as roughness or smoothness. However, this method forces participants to analyze their tactile experience on as many bipolar adjective scales as are needed to grasp the variations of tactile sensations. Here we propose a system that evaluates tactile feelings expressed by a single sound symbolic word. When a user inputs a sound symbolic word into the system, the system refers to a database and estimates the relationship between the sounds used in the expression and its meaning. The result is expressed by 26 pairs of scales, such as “rough - smooth”, “warm - cold”, and “soft - hard”.

Ryuichi Doizaki, Junji Watanabe, Maki Sakamoto
Does Just Noticeable Difference Depend on the Rate of Change of Kinesthetic Force Stimulus?

In all prior applications, it is assumed that the just noticeable difference (JND) for a kinesthetic force stimulus is independent of the rate of change of the stimulus. In this work, we study how the JND is affected by the rate of change of the stimulus. This study has a possible application in the better design of haptic data compression algorithms. We design an experimental setup in which users are subjected to a continuous haptic force that starts increasing or decreasing from a fixed reference force value, and record the haptic responses. A machine learning classifier (SVM) is trained using the recorded haptic responses to estimate the best-fit linear decision boundary between the JND and the rate of change of the stimulus. Our results show that the JND decreases for faster changes in the stimulus. We also demonstrate an asymmetric behavior of perception between the increasing and decreasing cases.

Amit Bhardwaj, Subhasis Chaudhuri
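
As an illustrative aside to the abstract above: fitting a linear SVM to labeled (rate of change, force deviation) response points and reading the decision boundary off its coefficients is one way to obtain a JND-versus-rate line. The sketch below uses synthetic responses and an assumed JND trend purely as placeholders, not the authors' recordings.

    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)

    # Synthetic trials: each row is (rate of change of force, force deviation),
    # labeled 1 if the change was detected, 0 otherwise.
    rate = rng.uniform(0.1, 2.0, 400)            # N/s
    deviation = rng.uniform(0.0, 1.0, 400)       # N
    assumed_jnd = 0.5 - 0.15 * rate              # assumption: JND shrinks with faster change
    detected = (deviation > assumed_jnd).astype(int)

    X = np.column_stack([rate, deviation])
    clf = SVC(kernel="linear", C=10.0).fit(X, detected)

    # The linear boundary w0*rate + w1*deviation + b = 0 gives an estimated
    # JND-versus-rate relationship: deviation = -(w0*rate + b) / w1.
    (w0, w1), b = clf.coef_[0], clf.intercept_[0]
    for r in (0.2, 1.0, 2.0):
        print(f"rate {r:.1f} N/s -> estimated JND {-(w0 * r + b) / w1:.2f} N")
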
Subject-Specific Distortions in Haptic Perception of Force Direction

In a previous study, we found that the accuracy of human haptic perception of force direction is not very high. We also found an effect of physical force direction on the error subjects made, resulting in ‘error patterns’. In the current study, we assessed the between- and within-subject variation of these patterns. The within-subject variation was assessed by measuring the error patterns repeatedly over time for the same set of subjects. Many of these patterns were correlated, which indicates that they are fairly stable over time and thus subject-specific. The between-subject analysis, conversely, yielded hardly any significant correlations. We also measured general subject parameters that might explain this between-subject variation, but these parameters did not correlate with the error patterns. In conclusion, we found that the error patterns of haptic perception of force direction are subject-specific and probably governed by an internal subject parameter that we have not yet discovered.

Femke E. van Beek, Wouter M. Bergmann Tiest, Frank L. Gabrielse, Bart W. J. Lagerberg, Thomas K. Verhoogt, Bart G. A. Wolfs, Astrid M. L. Kappers
Perceptual Evaluation of the Passive/Active Torque and Stiffness Asymmetry of a Hybrid Haptic Device

Hybrid haptic interfaces combining brakes and motors can present dissimilar torque and stiffness capabilities when dissipating or restoring energy. This paper aims at identifying the asymmetry thresholds that lead to an alteration in the perception of elasticity simulated by such devices. 17 subjects took part in an experiment in which they interacted with virtual springs with either controllable stiffness or torque asymmetry levels and identified whether the springs were symmetric or not. Experimental results indicate that when the decompression stiffness or torque was less than 80 % and 60 % of the compression stiffness or torque respectively, users did not perceive the asymmetry in 80 % of trials. This suggests that hybrid devices can present dissimilar active/passive torque or stiffness capabilities without affecting the perception of elasticity.

Carlos Rossa, Margarita Anastassova, Alain Micaelli, José Lozada
Vibrotactile Frequency Discrimination Performance with Cross-Channel Distractors

To obtain accurate surface information about an object by touch, we have to selectively attend to one source of tactile sensory information while ignoring the influence of others that occur at the same time (e.g., the sensation of a shirt sleeve contacting the skin). To investigate how the brain isolates different events, we examined frequency discrimination performance using vibrotactile stimuli presented to one finger while presenting distractor stimuli to another finger. Results show that the perceived frequency of the target stimulus shifted toward the frequency of the distractor. The shift of perceived frequency occurred even when the distractor was presented to the other hand. Meanwhile, the distractor effect disappeared when the onset of the distractor was prior to that of the target stimulus. Our results indicate the possibility that the brain uses temporal synchrony between stimuli as a strong cue to associate multiple stimuli as a single event.

Scinob Kuroki, Junji Watanabe, Shin’ya Nishida
Weights in Visuo-Haptic Softness Perception are not Sticky

The contribution of vision to visuo-haptic softness judgments has been observed to be non-optimal, and it has been speculated that the visual weights are “sticky”, i.e., do not account for the senses’ reliabilities [1, 2]. The present study tested the hypothesis of sticky weights by varying the quality of the visual information. Participants discriminated between the softness of two objects with deformable surfaces using only visual, only haptic, or bisensory information. Visually, we displayed the finger’s positions and stimulus deformations in a noisy or precise quality, or in a precise quality enhanced by visual force information from color changes of the finger nail. We assessed the reliabilities of the judgments using the method of constant stimuli. In bisensory conditions, discrepancies between the two senses’ information were used to measure each sense’s weight. The reliability of visual judgments was lower with noisy as compared to precise position information; visual force information did not affect reliability. The reliability of bisensory judgments was suboptimal and visual weights were higher than optimal. Contrary to expectation, the visual weights shifted with the visual reliability. The results confirm that visuo-haptic integration of softness information is suboptimal and biased towards vision, but with weights that are “lazy” rather than sticky.

Knut Drewing, Onno Kruse
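
As a reading aid to the abstract above: the maximum-likelihood (optimal) benchmark against which "sticky" or "lazy" weights are judged sets each sense's weight in proportion to its reliability (inverse variance),

$$ w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_H^2}, \qquad w_H = 1 - w_V, \qquad \hat{S}_{VH} = w_V \hat{S}_V + w_H \hat{S}_H, \qquad \sigma_{VH}^2 = \frac{\sigma_V^2\,\sigma_H^2}{\sigma_V^2 + \sigma_H^2}. $$

Perfectly "sticky" weights would keep $$w_V$$ fixed when the visual reliability $$1/\sigma_V^2$$ changes; the finding reported here is that $$w_V$$ does shift with visual reliability, only not as far as this optimum (the symbols are generic notation, not the authors').
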
Haptic Shape Constancy Across Distance

To explore haptic shape constancy across distance, we measured perceived curvature thresholds of cylindrical shapes cut out of acetal resin blocks. On each trial, blindfolded observers used their bare finger to scan the surface of two shapes consecutively. One shape was close to the observer and the other was positioned further away. This spatial displacement changes the available proprioceptive information about the object shape, and therefore the combined proprio-tactile information may signal different objects at the two distances. The results reveal a perceptual compensation for the change in proprioceptive information. However, two distinct patterns of distance compensation emerged: the data from one group are consistent with predictions from visual object constancy. The other group of observers demonstrated the reverse pattern of response, such that objects further away needed to have lower curvature to be perceived as having equal curvature. We suggest that perceived haptic curvature across distance depends on observers’ differential weighting of the multiple available cues.

Jess Hartcher-O’Brien, Alexander Terekhov, Malika Auvray, Vincent Hayward
Response Time-Dependent Force Perception During Hand Movement

For the perception of haptic environmental properties such as stiffness, damping, or inertia, estimates of force and movement must be combined continuously over time. We investigate the relation between the sensitivity of perceptual judgments about force and the time at which a perceptual response is given, for different types of hand movements. Portions of the response data are selected according to their response time and psychometric functions are fitted. In this way, we can estimate time-dependent JND and PSE functions. We show that the JND differs depending on which portion of the response time data it is based on. The JND follows the same pattern for responses given after 650 ms irrespective of the movement performed. Furthermore, we find that forces are consistently overestimated during movement.

Markus Rank, Massimiliano Di Luca
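
A generic illustration of how time-dependent JND and PSE values can be read off fitted psychometric functions: for each response-time bin one fits a cumulative Gaussian, whose mean gives the PSE and whose standard deviation is a conventional JND estimate. The force levels and counts below are made-up placeholders, not the study's data.

    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import norm

    def psychometric(x, pse, sigma):
        """Probability of judging the comparison force as stronger."""
        return norm.cdf(x, loc=pse, scale=sigma)

    # Made-up data for one response-time bin: comparison force levels (N),
    # "comparison felt stronger" counts, and trials per level.
    levels = np.array([0.8, 0.9, 1.0, 1.1, 1.2])
    n_stronger = np.array([2, 7, 14, 22, 27])
    n_trials = 30

    p = n_stronger / n_trials
    (pse, sigma), _ = curve_fit(psychometric, levels, p, p0=[1.0, 0.1])

    # For a cumulative Gaussian the 50 % point is the PSE and sigma corresponds
    # to the 50 %-to-84 % distance, a common JND estimate.
    print(f"PSE = {pse:.3f} N, JND = {sigma:.3f} N")
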
Amplitude and Duration Interdependence in the Perceived Intensity of Complex Tactile Signals

The dependency of the perceived intensity of a short stimulus on its duration is well established in vision and audition. No such phenomenon in terms of the amplitude has been reported for the tactile modality. In this study naive observers were presented with pink noise vibrations enveloped in a Gabor wavelet. Characteristic durations ranging between 0.1 and 0.7 s and intensities ranging from 0.3 to 3.0 $$\times 10^{-3}$$ m/s$$^{2}$$ were presented to the fingertip. Using a two-alternative forced-choice staircase procedure, the points of subjective equivalence were estimated for the 0.4 s long reference stimulus. Similarly to vision and audition, lower intensities were consistently reported for shorter stimuli. The observed relationship could be interpreted as reflecting a mechanism of haptic constancy with respect to exploration speed.

Séréna Bochereau, Alexander Terekhov, Vincent Hayward
Multi-digit Position and Force Coordination in Three- and Four-Digit Grasping

In this paper, we investigate the redundancy problem by examining how humans control grasping of a hand-held object using three and four digits in response to external perturbations. Our results revealed a similar variability in the digits’ initial placements when grasping with three and four digits. Moreover, the distribution of digit normal forces was modulated depending on the number of digits used, their locations, and the type of the external perturbations. Our results suggest that the redundancy problem was addressed by the central nervous system in a similar fashion on a trial-to-trial basis in terms of the digits’ initial placements, but differently in terms of the digits’ normal force distribution, which was controlled online depending on the number of digits actively involved in the grasp.

Abdeldjallil Naceri, Alessandro Moscatelli, Marco Santello, Marc O. Ernst
Role of Occlusion in Non-Coulombic Slip of the Finger Pad

Understanding how fingers slip on surfaces is essential for elucidating the mechanisms of haptic perception. This paper describes an investigation of the relationship between occlusion and the non-Coulombic slip of the finger pad, which results in the frictional force being a power law function of the normal load with an index $$n$$; Coulombic slip corresponds to $$n = 1$$. For smooth impermeable surfaces, occlusion of moisture excreted by the sweat glands may cause up to an order of magnitude increase in the coefficient of friction with a characteristic time of ~20 s. This arises because the moisture plasticises the asperities on the finger print ridges, resulting in an increase in their compliance and hence an increase in the contact area. Under such steady state sliding conditions a finger pad behaves like a Hertzian contact decorated with the valleys between the finger print ridges, which only act to reduce the true but not the nominal contact area. In the limit, at long occlusion times (~50 s), it can be shown that the power law index tends to a value in the range $$2/3 \le n \le 1$$. In contrast, measurements against a rough surface demonstrate that the friction is not affected by occlusion and that a finger pad exhibits Coulombic slip.

Brygida Maria Dzidek, Michael Adams, Zhibing Zhang, Simon Johnson, Séréna Bochereau, Vincent Hayward
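
The friction law referred to in this abstract can be stated compactly; the generic power-law form below follows directly from the text (the symbols $$F_f$$ for frictional force, $$W$$ for normal load and $$k$$ for a constant are generic notation, not the paper's):

$$ F_f = k\,W^{\,n}, \qquad n = 1 \;(\text{Coulombic slip}), \qquad \tfrac{2}{3} \le n \le 1 \;(\text{long occlusion, Hertz-like contact}). $$

For $$n < 1$$ the effective coefficient of friction $$F_f/W = k\,W^{\,n-1}$$ falls as the normal load grows, which is the signature of non-Coulombic slip.
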
An Analysis of the Influence of a Pseudo-haptic Cue on the Haptic Perception of Weight

Haptics provides powerful cues about forces but cannot easily be integrated in all relevant applications, such as education. Pseudo-haptic cues, visual information that simulates haptic sensations, have been proposed as an alternative. It is, however, largely unknown how (or even if) pseudo-haptic cues are perceived by the haptic sensory modality. In this paper we present an approach that applies theories of multimodal integration to test whether a pseudo-haptic cue triggers haptic perception. This approach is subsequently applied in designing an experiment that tests a pseudo-haptic cue based on a visual force-causes-displacement metaphor, similar to a rubber band.

Karljohan Lundin Palmerius, Daniel Johansson, Gunnar Höst, Konrad Schönborn
Perceived Softness of Composite Objects

What is the apparent softness of a grasped object composed of two compliant materials? Experimental data indicate that the perceived softness of a composite object depends on how the object is grasped and how it is oriented. If the object is grasped with a precision grip using the index finger and thumb, turning the object around leads to a consistent change in overall perceived softness. Namely, the composite object seems softer when the index finger is in contact with the more compliant material than when it is in contact with the stiffer material. Importantly, such a difference in perceived softness due to object orientation is not present when the precision grip is obtained by opposing the index and middle fingers to the thumb.

Massimiliano Di Luca
Evaluating Virtual Embodiment with the ALEx Exoskeleton

The assessment of virtual embodiment has focused primarily on experimental paradigms based on multisensory congruent cues, such as auditory, tactile, visual and motor cues, mainly due to the technological limitations of haptic feedback. In this work, virtual embodiment in an avatar is assessed by means of a new lightweight exoskeleton (ALEx), with a focus on the perception of danger and aggressive behavior. In particular, an experiment was designed to assess the effectiveness of haptic feedback while interacting with an opponent avatar. The experiments are evaluated based on physiological measures and questionnaires.

Emanuele Ruffaldi, Michele Barsotti, Daniele Leonardis, Giulia Bassani, Antonio Frisoli, Massimo Bergamasco
Vibrotactile Stimuli for Distinction of Virtual Constraints and Environment Feedback

In virtual reality and teleoperation scenarios, active constraints can be used to guide a user, prevent him from entering forbidden regions, and assist him in general. On a haptic interface, this implies superimposing the forces generated by the active constraints on top of the forces that are generated by the environment (real or virtual). This creates the problem of distinguishing which of these two sources a stimulus comes from while perceiving feedback.

In this paper, we present an approach that consists in adding a vibrating component on top of the forces generated by the virtual constraints, while the forces generated by the environment are kept untouched. Blind experiments in which users have to navigate through a scenario containing both active constraints and randomly positioned objects show that users perceive the presence of the unexpected objects more successfully with our approach than with previously existing ones. Moreover, the penetration into the constraints is as good as with the classical approach.

Adrian Ramos, Domenico Prattichizzo
Rendering of Virtual Walking Sensation by a Passive Body Motion

This paper describes a technique for rendering a virtual walking sensation to a seated user on a multisensory VR (virtual reality) display system. The basic approach is to move the participant’s body mechanically to produce the sensation of walking, where the participant’s body is considered part of the display media that projects the sensation of body motion to the brain. A motion seat was built and used to create lift and roll motions in the present study. As basic reference data, a real walking motion on a treadmill was measured. The lift and roll motions were independently investigated to clarify the characteristics of each presentation. The optimal seat motion for the sensation of walking was as small as about one fifteenth to one fifth of the real walking motion.

Yasushi Ikei, Seiya Shimabukuro, Shunki Kato, Yujiro Okuya, Koji Abe, Koichi Hirota, Tomohiro Amemiya

Human Computer Interaction

Frontmatter
Touch Accessibility on the Front and the Back of Held Tablet Devices

Direct touch cannot reach the entire interaction area of a hand-held tablet. This paper explores which areas are accessible for direct touch on the front as well as on the back of tablets. The insights gained can serve as a basis for interaction designers to place GUI widgets, highlight areas that are hard to touch, and thus motivate further research on indirect pointing techniques for touch interaction.

Katrin Wolf, Robert Schleicher, Michael Rohs
Multisensory Memory for Object Identity and Location

Researchers have reported that audiovisual object presentation improves memory encoding of object identity in comparison to either auditory or visual object presentation. However, multisensory memory effects on retrieval, on object location, and of other multisensory combinations are yet unknown. We investigated the effects of visuotactile presentation on the encoding and retrieval of object identity memory and object location memory. Participants played an electronic memory card game consisting of an encoding and retrieval phase. In the encoding phase (c) they explored four game cards, which were presented on a computer screen in a two by two arrangement. Participants could touch each card to experience a Morse code presented on the screen (V) and/or via a tactile vibrator attached to the participant’s index finger (T). In the retrieval phase (r), they had to indicate for eight cards if (recognition) and where (relocation) these had been presented earlier. Compared with the visual base line (cV-rV), we found that both ‘multisensory encoding’ (cVT-rV) and ‘multisensory retrieval’ (cV-rVT) significantly improved both recognition and relocation performance. Compared with the tactile base line (cT-rT), we found no multisensory encoding or retrieval effects. This means that vision can benefit from adding touch but not vice versa. We conclude that visuotactile presentation improves memory encoding and retrieval of object identity and location. However, it is not yet clear whether these benefits are due to multisensory integration or simply due to the processing of the same information in multiple sensory modalities.

Jan B. F. van Erp, Tom G. Philippi, Peter J. Werkhoven
Interaction-Based Dynamic Measurement of Haptic Characteristics of Control Elements

The force-displacement curve is typically used today to haptically characterize control elements. Elements with the same curve, however, may still lead to quite different percepts because the curve describes only static information. To overcome this limitation, a new dynamic measurement method is introduced. It can directly measure dynamic interaction signals between a finger-mimicking measurement device and control elements. Using this measurement method, novel control elements such as touchpads with active haptic feedback can also be technically specified and evaluated for the first time.

Wenliang Zhou, Jörg Reisinger, Angelika Peer, Sandra Hirche
Optimal Exploration Strategies in Haptic Search

In this study, we have investigated which strategies were optimal in different haptic search tasks where items were held in the hand. Blindfolded participants had to detect the presence of a target among distractors while using predetermined strategies. The optimal strategy was determined by considering reaction time and error data. In the search for salient targets, a rough sphere among smooth ones or a cube among spheres, parallel strategies were optimal. With non-salient targets, the results were different. In the search for a sphere among cubes, a parallel strategy was effective, but only if the fingers could be used. In the search for a smooth sphere among rough ones, only a serial strategy was successful. Thus, the optimal strategy depended on the required perceptual information.

Vonne van Polanen, Wouter M. Bergmann Tiest, Noortje Creemers, Merel J. Verbeek, Astrid M. L. Kappers
Encountered-Type Haptic Interface for Grasping Interaction with Round Variable Size Objects via Pneumatic Balloon

This paper presents the concept and implementation of an encountered-type haptic interface that enables interaction with soft round-shaped objects. The core part of the system is a silicone balloon whose size is controlled by a pneumatic actuator in accordance with the size of the virtual object being touched. The balloon is mounted on a position-controlled robotic arm, and its position is controlled so that it follows the user’s hand. The user encounters the balloon only when the proxy representing the hand in the virtual world is in contact with the virtual object. Our new haptic system allows the user not only to grasp a soft object of variable size, but also to follow the contour of the virtual object. The accuracy and control bandwidth of the rendering were measured and show that the radius of the balloon was accurately controlled at high speed. The potential of the system is also demonstrated by an example interaction scenario: interaction with a human avatar’s arm.

Noman Akbar, Seokhee Jeon
Hand-Skill Learning Using Outer-Covering Haptic Display

During hand-skill learning, it is important for the learner to manipulate a tool with the appropriate amount of force. Haptic training systems in which a tool is directly actuated are therefore undesirable due to the fact that the force applied to the tool is assisted by an actuator. We report a method in which we have replaced guiding by a tool with an outer-covering haptic display in which a guidance sensation was imparted to the back of a learner’s hand. To show the effectiveness of this approach for hand-skill learning, we conducted experiments with comparisons to two existing methods: a haptic guidance system in which the tool is directly actuated and a visual information guidance system.

Vibol Yem, Hideaki Kuzuoka, Naomi Yamashita, Shoichi Ohta, Yasuo Takeuchi
Audio-Haptic Car Navigation Interface with Rhythmic Tactons

Although the car environment is often noisy and driving requires visual attention, navigation instructions are still given through audio and visual feedback. Using rhythmic tactons together with audio could better support the navigation task in the driving context. In this paper we describe a haptic-audio interface with a simple two-actuator setup on the wheel that uses rhythmic tactons to support navigation in the car environment. Users who tested the interface with a driving game would choose the audio-haptic interface over the audio-only interface for a real navigation task.

Toni Pakkanen, Roope Raisamo, Veikko Surakka
Localization Ability and Polarity Effect of Underwater Electro-Tactile Stimulation

In this paper, we describe a method for presenting underwater tactile electrical stimulation for in-bath entertainment. We investigated the localization abilities of participants and the polarity effect of the stimulation, and found that underwater electro-tactile anodic stimulation produced a stronger sensation than did cathodic stimulation. Furthermore, we found that the participants were able to successfully identify the direction of the tactile stimulation and the direction of rotation during anodic stimulation.

Taira Nakamura, Manami Katoh, Taku Hachisu, Ryuta Okazaki, Michi Sato, Hiroyuki Kajimoto
Perceptual Strategies Under Constrained Movements on a Zoomable Haptic Mobile Device

This study shows that zooming and pointing strategies are influenced by visual constraints when using a haptic mobile device. Participants were required to point at invisible targets that were only detectable via tactile feedback. Movements were either constrained or unconstrained. Results revealed that pointing and zooming strategies depended on the order of training. Participants who started their training with unconstrained movements kept using the same strategies even when the constraints had been removed. This suggests that constrained movements allowed participants to explore other strategies that would not have been available otherwise and extended their repertoire of exploratory strategies related to the haptic zoom.

Mounia Ziat, Eric Lecolinet, Olivier Gapenne, Gerard Mouret, Charles Lenay
Improved Haptic Music Player with Auditory Saliency Estimation

This paper presents improvements made on our previous haptic music player designed to enhance the music listening experience with mobile devices. Our previous haptic music player featured dual-band rendering; it delivers bass beat sensations in music with rough superimposed vibrations and high-frequency salient features in music with high-frequency smooth vibrations. This work extends the previous algorithm by taking auditory saliency into account in determining the intensity of the vibration to be rendered. The auditory saliency is estimated in real time from several auditory features in music. The feasibility of multiband rendering was also tested using a wideband actuator. We carried out a user study to evaluate the subjective performance of three haptic music playing modes: saliency-improved dual-band rendering, saliency-improved multiband rendering, and our previous dual-band rendering. Experimental results showed that the new dual-band mode has perceptual merits over the multiband mode and the previous dual-band mode, particularly for rock or dance music. The results can contribute to enhancing multimedia experience by means of vibrotactile rendering of music.

Inwook Hwang, Seungmoon Choi
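
A crude sketch of the dual-band idea described above: split the audio into a bass-beat band and a high-frequency band and derive a vibration intensity envelope for each. The Butterworth filters and the short-time-energy stand-in for auditory saliency are assumptions for illustration, not the authors' algorithm.

    import numpy as np
    from scipy.signal import butter, sosfiltfilt

    def band(signal, fs, low, high):
        """Band-limit the audio with a 4th-order Butterworth band-pass filter."""
        sos = butter(4, [low, high], btype="bandpass", fs=fs, output="sos")
        return sosfiltfilt(sos, signal)

    def frame_energy(signal, frame):
        """Short-time RMS per frame, used here as a stand-in for auditory saliency."""
        n = len(signal) // frame
        return np.sqrt(np.mean(signal[:n * frame].reshape(n, frame) ** 2, axis=1))

    fs = 44100
    t = np.arange(fs * 2) / fs
    music = np.sin(2 * np.pi * 80 * t) + 0.3 * np.sin(2 * np.pi * 4000 * t)  # toy "music"

    bass = band(music, fs, 40, 200)        # would drive rough, beat-like vibrations
    treble = band(music, fs, 2000, 8000)   # would drive smooth, high-frequency vibrations

    frame = fs // 100                      # 10 ms frames
    vib_low = frame_energy(bass, frame)    # vibration intensity envelope, low band
    vib_high = frame_energy(treble, frame) # vibration intensity envelope, high band
    print(vib_low[:5].round(3), vib_high[:5].round(3))
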
An Initial Study on Pitch Correction Guidance for String Instruments Using Haptic Feedback

Learning to play a string instrument takes years of practice. Novice players even find it difficult to play correct pitches, and chromatic tuners that visualize the errors in the played pitches have long been used as an effective aid. However, chromatic tuners can divert users’ visual attention from more essential visual cues such as musical scores. As an alternative, we have been developing HapTune (Haptic Tuner), which conveys pitch errors to the user using haptic feedback. In this paper, we present an initial design of HapTune that relies on spatiotemporal information coding using two vibrotactile actuators. We also verified through a user experiment that the vibrotactile stimuli provided by HapTune can be easily recognized.

Yongjae Yoo, Seungmoon Choi
Simulated Social Touch in a Collaborative Game

In this paper we present a study in which participants played a collaborative augmented reality game together with two virtual agents, visible in the same augmented reality space. During interaction one of the virtual agents touches the user on the arm, by means of a vibrotactile display. We investigated whether social touch by a virtual agent in a collaborative setting would positively influence the participant’s perception of this touching virtual agent. Results showed that the touching virtual agent was rated higher on affective adjectives than the non-touching agent.

Gijs Huisman, Jan Kolkmeier, Dirk Heylen
Towards Palpation in Virtual Reality by an Encountered-Type Haptic Screen

Virtual Reality technology takes on an increasingly prominent role in modern medical training programs. The fidelity of the Virtual Reality environment is of crucial importance and directly relates to the quality of skill transfer. To train interventions that rely on physical interaction with the patient, high-quality haptic feedback should be provided. Palpation is a prototypical example of an important medical skill relying heavily on physical interaction. Since the interaction is quite involved and thus complex to render, very few training systems have attempted to train palpation so far, and none of those provide a truly natural experience. In this work a novel haptic interface for free-hand palpation is presented. The interface follows the principle of an encountered-type robot, replicating more naturally the actual palpation procedure. The concept and design of this new development are introduced. A control method to render spatially distributed kinaesthetic information is explained next. Experimental results give a proof of principle indicating that this approach might lead towards a more natural virtual palpation experience.

Sergio Portolés Diez, Emmanuel B. Vander Poorten, Gianni Borghesan, Dominiek Reynaerts
Haptic Expressions of Stress During an Interactive Game

The study of the potential of the haptic channel to convey emotions is very promising for human-machine interaction and mediated communication. However, most research has investigated acted emotions, which are not representative of natural and spontaneous behaviors. This paper addresses the issue of the expression of spontaneous emotions. In the context of a game application that involves haptic interaction, a suitable scenario and context were designed to elicit a spontaneous stressed affective state. The haptic behavior of participants was subsequently analyzed in order to highlight the changes during and after the elicitation of this affective state.

Yoren Gaffary, Jean-Claude Martin, Mehdi Ammi
Grasp Mapping Between a 3-Finger Haptic Device and a Robotic Hand

This paper presents the implementation of a robust grasp mapping between a 3-finger haptic device (master) and a robotic hand (slave). The mapping is based on a grasp equivalence defined considering the manipulation capabilities of the master and slave devices. The metrics that translate the human hand gesture to the robotic hand workspace are obtained through an analytical user study. This allows a natural control of the robotic hand. The grasp mapping is accomplished by defining 4 control modes that encapsulate all the grasp gestures considered.

Transitions between modes guarantee collision-free movements and no damage to grasped objects. Detection of contact with objects is done by means of customized tactile sensors based on MEMS barometers.

The methodology herein presented can be extended for controlling a wide range of different robotic hands with the 3-finger haptic device.

Francisco Suárez-Ruiz, Ignacio Galiana, Yaroslav Tenzer, Leif P. Jentoft, Robert D. Howe, Manuel Ferre
The HapBand: A Cutaneous Device for Remote Tactile Interaction

In this work we present a novel haptic device that applies cutaneous force feedback to the forearm. We call it the HapBand. It is composed of three moving plates whose action on the forearm resembles the squeeze of a human hand. In order to validate the device, we carried out an experiment on remote tactile interaction. A glove, instrumented with five force sensors, registered the contact forces at the remote site, while the HapBand conveyed the registered sensation to the user’s forearm. Results showed that the HapBand closely reproduces the squeezing sensation on the forearm.

Francesco Chinello, Mirko Aurilio, Claudio Pacchierotti, Domenico Prattichizzo
Similarity of Blind and Sighted Subjects When Constructing Maps with Small-Area Tactile Displays: Performance, Behavioral and Subjective Aspects

Conveying spatial information to visually impaired people is possible by leveraging residual tactile abilities. It is still unclear how to effectively evaluate mental map construction beyond performance-based metrics. Here we use a minimalistic mouse-shaped tactile device to display tactile virtual objects. We study how task complexity and visual deprivation influence behavioral, subjective and performance variables in both blind and sighted subjects. Complexity proves to be a factor that affects both groups equally. We also show that performance, the amount of acquired information and subjective judgments of task difficulty do not depend on visual deprivation. The results can help inform technological solutions in rehabilitation programs for impaired individuals.

Mariacarla Memeo, Claudio Campus, Laura Lucagrossi, Luca Brayda
Scale Dependence of Force Patterns During the Scanning of a Surface by a Bare Finger

This study investigated how the geometry of a touched surface influences forces felt during the scanning of an object with the finger. Although prior research, and basic mechanical considerations, have shed light on force production during haptic interaction with a touched object, surprisingly little is known about how to relate surface detail at different scales to the specific patterns of force that are observed. To address this, we designed an apparatus that could accurately measure normal and tangential forces between a finger and a surface. We fabricated sinusoidal surfaces with precisely controlled geometry, and measured spatial variations in resultant forces generated while subjects repeatedly scanned the surfaces at specified speed and pressure. Subsequent analysis revealed that the resulting force patterns varied in an organized way with spatial scale, and that fluctuations, in the form of aperiodic signal components that proved difficult to model, played an increasingly important role as the spatial scale of the surface geometry decreased. The results may help to explain differences in how surface detail is recovered by the haptic perceptual system at different length scales.

Marco Janko, Richard Primerano, Yon Visell
Introducing the Modifier Tactile Pattern for Vibrotactile Communication

We introduce the concept of “Modifier Tactile Pattern” as a pattern that modifies the interpretation of other elements that compose a Tacton or an entire tactile message. This concept was inspired by the prefixation strategies of the Braille system. We also show how to design tactile languages applying the concept of Modifier by following methodologies and approaches of Tacton design that already exist in literature. Then a modifier-based tactile language is designed and assessed in a user study.

Victor Adriel de J. Oliveira, Anderson Maciel
CARess, a Gentle Touch Informs the Driver

A prototype of a Human Machine Interface (HMI) used to deliver information to the driver in cars is described in this paper. The delivery of information is based on the Informative Interruptive Cue (IIC) approach. The interface is a matrix of 4 × 3 vibrating motors, controlled through a real-time algorithm based on apparent motion and the phantom illusion to create continuous and discrete tactile patterns. A first experiment was conducted with 22 participants to examine their ability to discriminate the tactile patterns displayed by the interface placed on the back of a chair. Results showed 61.48 % successful recognition of tactile stimuli. A second experiment based on a free categorisation of the haptic stimuli was performed with another set of 20 participants. The goal was to understand the dimensions of the conceptual space chosen by the participants when telling tactile stimuli apart. Outcomes suggest that parameters such as speed, movement continuity and complexity are used for grouping.

Stefano Trento, Amalia de Götzen, Stefania Serafin
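
The apparent-motion/phantom-illusion control named in the abstract above is commonly realized by splitting one virtual vibration point between the two nearest motors with energy-preserving weights and sweeping that point over time. The sketch below is a generic illustration of that interpolation (the motor spacing and the energy-model weighting are assumptions, not the authors' exact controller).

    import numpy as np

    def phantom_drive(position, motor_positions, amplitude):
        """Drive the two motors bracketing `position` so a single phantom
        vibration point is felt there (energy model: A1 = sqrt(1-b)A, A2 = sqrt(b)A)."""
        drives = np.zeros(len(motor_positions))
        idx = np.searchsorted(motor_positions, position)
        i, j = max(idx - 1, 0), min(idx, len(motor_positions) - 1)
        if i == j:                       # phantom point sits exactly on a motor
            drives[i] = amplitude
            return drives
        b = (position - motor_positions[i]) / (motor_positions[j] - motor_positions[i])
        drives[i] = np.sqrt(1.0 - b) * amplitude
        drives[j] = np.sqrt(b) * amplitude
        return drives

    # One row of a 4 x 3 array: assume motors 60 mm apart along the chair back.
    row = np.array([0.0, 60.0, 120.0, 180.0])
    # Sweeping the phantom point across the row creates apparent motion.
    for x in np.linspace(0.0, 180.0, 7):
        print(f"{x:6.1f} mm ->", np.round(phantom_drive(x, row, 1.0), 2))
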
Text Entry Performance Evaluation of Haptic Soft QWERTY Keyboard on a Tablet Device

Touch screens are now widely used in mobile devices to provide intuitive and natural interactions with the fingertips. However, the lack of tactile feedback makes it difficult for users to receive key-click confirmation during text entry on soft keyboards. This paper examines the effect of tactile feedback on typing performance with the soft QWERTY keyboard: the most commonly used multi-finger text entry method on tablet devices. We implemented tactile feedback hardware and software to simulate the key-click effect on a commercially-available mobile tablet (Microsoft Surface Pro). We conducted a typing experiment to measure user performance and preference. The participants transcribed given phrases under three sensory feedback conditions: visual only, visual and audio, and visual and tactile. The results are unexpected; we did not find any significant difference in typing performance, and the tactile condition was received as positively as the audio condition and better than the visual-only condition. This study thus reports different findings from previous work studying text entry on handheld devices, encouraging further examination to fully understand the effect of tactile feedback on text entry on tablet devices.

Byung-Kil Han, Kwangtaek Kim, Koji Yatani, Hong Z. Tan
Morphing Tactile Display for Haptic Interaction in Vehicles

This work describes the design, fabrication and characterization of a morphing (actuated) tactile display based on an array of 32 electromagnetic actuators and a capacitive touch deformable layer. A two-stage locking device allows for reliable pattern creation when the user’s finger is placed on the interface. Taxel (tactile pixel) displacements of up to 1.9 mm and a holding force of 1.25 N at 9 W electrical power have been measured. The device, integrated in the centre console of vehicles, is designed to work in combination with an LCD display mounted in the dashboard to control secondary vehicle systems.

Christian Bolzmacher, Gérard Chalubert, Olivier Brelaud, Jean-Philippe Alexander, Moustapha Hafez
Static Force Rendering Performance of Two Commercial Haptic Systems

The performance of a haptic device generally varies across its workspace. This work measures the force rendering accuracy and maximum force output rendering capabilities of two commercial devices – the Falcon and Omega devices – in six directions at 46 and 49 set locations, respectively, in the mechanical workspace. The results are compared to several theoretical measures of performance based on force ellipsoids. Findings quantify the differences in rendering capability across the tested locations, and show that this variability can be mostly explained by the highly nonlinear nature of the Jacobian.

Fabio Tatti, Netta Gurari, Gabriel Baud-Bovy
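
For context, the force-ellipsoid measures mentioned above can be computed directly from a device Jacobian: with joint torques bounded by $$\|\tau\| \le 1$$ and $$\tau = J^{T}F$$, reachable end-effector forces satisfy $$F^{T}(JJ^{T})F \le 1$$, so the ellipsoid's principal radii are the reciprocals of the singular values of $$J$$. The two-link Jacobian below is a toy example, not either commercial device's kinematics.

    import numpy as np

    def force_ellipsoid(J):
        """Principal radii and directions of {F : F^T (J J^T) F <= 1},
        i.e. end-effector forces reachable with joint torques ||tau|| <= 1."""
        U, s, _ = np.linalg.svd(J)
        return 1.0 / s, U   # radii lie along the columns of U

    # Toy planar 2-link arm Jacobian at one configuration (links 0.30 m and 0.25 m).
    l1, l2, q1, q2 = 0.30, 0.25, 0.6, 1.1
    J = np.array([
        [-l1 * np.sin(q1) - l2 * np.sin(q1 + q2), -l2 * np.sin(q1 + q2)],
        [ l1 * np.cos(q1) + l2 * np.cos(q1 + q2),  l2 * np.cos(q1 + q2)],
    ])

    radii, axes = force_ellipsoid(J)
    print("force ellipsoid radii (N per unit joint torque):", np.round(radii, 2))
    print("principal directions (columns):")
    print(np.round(axes, 3))
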
Adaptation to Motion Presented with a Tactile Array

We investigated the effects of adaptation to 2 min of tactile apparent motion along the proximo-distal axis of the finger pad, produced with a vibrotactile array (Optacon), and developed a novel method to reveal the tactile motion aftereffect. Participants continuously reported perceived direction during adaptation to motion in the distal or proximal direction. The clarity of the direction percept weakened over time. Following this adaptation phase, participants judged the direction of a dynamic test stimulus composed of simultaneous motion in both directions. A tactile motion aftereffect (tMAE) resulted - the test stimulus was felt to move in the direction opposite to the adapting motion. The tMAE was robust to changes in the stimulus including speed and spatial features of the moving pattern, but there was a general bias to perceive distal motion. The implication for tactile devices is that motion signals should be brief and varied to avoid adaptation artifacts.

Sarah McIntyre, Tatjana Seizova-Cajic, Ingvars Birznieks, Alex O. Holcombe, Richard M. Vickery
Design and Evaluation of a Peaucellier-Lipkin Linkage Based Haptic Interface

This paper presents the design and characterization of a single degree of freedom haptic device based upon the Peaucellier-Lipkin mechanism, which produces perfect straight-line motion. Compared to other single degree of freedom devices, the device features a larger workspace and force output while providing perfect straight-line motion. Link lengths that minimize the size of the mechanism for the desired workspace were determined following an optimization procedure. A prototype of the device has been built and evaluated using bandwidth and apparent inertia measurements.

Sanditi Khandelwal, Manas Karandikar, Abhishek Gupta
Haptic Feedback Intensity Affects Touch Typing Performance on a Flat Keyboard

We study the effect of haptic feedback intensity on touch typing performance on a flat keyboard. In this study, we investigate how local and global haptic feedback intensities affect typing performance and examine if haptic feedback at a higher intensity brings more performance benefit. We also investigate if auditory feedback intensity affects typing performance. Participants are asked to type on a flat keyboard with given texts on a computer screen. We measure typing performance in terms of typing speed, efficiency and error rate at different intensity levels for both haptic and auditory feedback. The results show that the intensity of haptic feedback affects typing performance while the intensity of auditory feedback does not. Our findings suggest that haptic feedback is beneficial to typing performance and its intensity should be carefully chosen in designing a flat keyboard.

Jin Ryong Kim, Hong Z. Tan

Surface or Texture Perception and Display

Frontmatter
Time Course of Grouping by Proximity and Similarity in a Haptic Speeded Orientation Task

Behavioral and neurophysiological findings in vision suggest that grouping by proximity occurs earlier than grouping by similarity. The present study investigated in the haptic modality whether proximity is an earlier/faster grouping principle than texture similarity. In this study, we compared responses to stimuli grouped by proximity with those grouped by similarity (surface texture) using a speeded orientation detection task performed on a novel haptic device. The apparatus was interfaced with a computer to allow controlled stimulus presentation and accurate registration of the responses. The experiment yielded two main results: (1) response times for stimulus patterns grouped by proximity were faster compared to those for patterns grouped by similarity; and (2) for patterns grouped by proximity, vertical symmetric patterns were classified faster than horizontal symmetric patterns. We conclude that the Gestalt principles of proximity and similarity apply to the haptic modality. As in vision, grouping by proximity is faster than grouping by similarity, especially when symmetric grouped patterns are oriented vertically in line with the body midline axis.

Antonio Prieto, Julia Mayas, Soledad Ballesteros
Unequal but Fair? Weights in the Serial Integration of Haptic Texture Information

The sense of touch is characterized by its sequential nature. In texture perception, enhanced spatio-temporal extension of exploration leads to better discrimination performance due to the combination of repetitive information. We have previously shown that the gains from additional exploration are smaller than the Maximum Likelihood Estimation (MLE) model of an ideal observer would predict. Here we test whether this suboptimal integration can be explained by unequal weighting of information. Participants stroked 2 to 5 times across a virtual grating and judged the ridge period in a 2IFC task. We presented slightly discrepant period information in one of the strokes of the standard grating. Results show linearly decreasing weights of this information with spatio-temporal distance (number of intervening strokes) to the comparison grating. For each exploration extension (number of strokes), the stroke with the highest number of intervening strokes to the comparison was completely disregarded. The results are consistent with the notion that memory limitations are responsible for the unequal weights. This study raises the question of whether models of optimal integration should include memory decay as an additional source of variance and thus not expect equal weights.

Alexandra Lezkan, Knut Drewing
The Influence of Material Cues on Early Grasping Force

The object of this study was to see whether differences in texture influence grip force in the very early phase of grasping an object. Subjects were asked to pick up objects with different textures either blindfolded or sighted, while grip force was measured. Maximum force was found to be adjusted to suit the differences in the coefficient of friction, confirming earlier results. Surprisingly, statistically significant differences in grip force were already present as early as 10 ms after touch onset in the blindfolded condition, despite the fact that only haptic information about the texture was available. This suggests that the haptic system is very fast in identifying a texture’s friction.

Wouter M. Bergmann Tiest, Astrid M. L. Kappers
A Robust Real-Time 3D Tracking Approach for Assisted Object Grasping

Robotic exoskeletons are being increasingly and successfully used in neuro-rehabilitation therapy scenarios. Indeed, they allow patients to perform movements requiring more complex inter-joint coordination and gravity counterbalancing, including assisted object grasping. We propose a robust RGB-D camera-based approach for automated tracking of both still and moving objects that can be used to assist reaching/grasping tasks in the aforementioned scenarios. The proposed approach allows working with objects that have not been pre-processed, giving the possibility to propose a flexible therapy. Moreover, our system is specialized to estimate the pose of cylinder-like shaped objects to allow cylinder grasps with the help of a robotic hand orthosis.

To validate our method in terms of both tracking and reaching/grasping performance, we present the results achieved in tests conducted both in simulation and on real robotic-assisted tasks performed by a patient.

Claudio Loconsole, Fabio Stroppa, Vitoantonio Bevilacqua, Antonio Frisoli
Diminished Haptics: Towards Digital Transformation of Real World Textures

In this study, we develop and implement a method for transforming real-world textures. By applying a squeeze film effect to real-world textures, we reduce the perceived haptic texture. This method can transform real-world textures, e.g., from paper-like to metal-like, from wood-like to paper-like, and so on. The textures provided by this system are inherently high resolution because real-world textures are used instead of synthesized data. We implemented a system using a 28-kHz transducer. Evaluations were conducted using a three-axis accelerometer.

Yoichi Ochiai, Takayuki Hoshi, Jun Rekimoto, Masaya Takasaki
A Data-Driven Approach to Remote Tactile Interaction: From a BioTac Sensor to any Fingertip Cutaneous Device

This paper presents a novel approach to remote tactile interaction, wherein a human uses a telerobot to touch a remote environment. The proposed system consists of a BioTac tactile sensor, in charge of registering contact deformations, and a custom cutaneous device, in charge of applying those deformations to the user’s fingertip via a 3-DoF mobile platform. We employ a novel data-driven algorithm to directly map the BioTac’s sensed stimuli to input commands for the cutaneous device’s motors, without using any kind of skin deformation model. We validated the proposed approach by carrying out a remote tactile interaction experiment. Although this work employed a specific cutaneous device, the experimental protocol and algorithm are valid for any similar display.

Claudio Pacchierotti, Domenico Prattichizzo, Katherine J. Kuchenbecker
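
One simple way to realize the kind of model-free, data-driven mapping described above is to regress device motor commands directly on raw tactile-sensor readings from paired calibration samples. The ridge-regression sketch below is only a generic illustration of the idea with synthetic numbers; it is not the authors' algorithm.

    import numpy as np

    rng = np.random.default_rng(2)

    # Hypothetical calibration set: each row pairs a raw tactile-sensor frame
    # (e.g. 19 electrode readings) with the 3 motor commands that reproduced the
    # corresponding fingertip deformation on the cutaneous device.
    sensor = rng.normal(size=(500, 19))
    true_map = rng.normal(size=(19, 3))
    motors = sensor @ true_map + 0.01 * rng.normal(size=(500, 3))

    # Ridge regression, closed form: W = (X^T X + lam I)^-1 X^T Y.
    lam = 1e-2
    W = np.linalg.solve(sensor.T @ sensor + lam * np.eye(19), sensor.T @ motors)

    # At run time a new sensor frame maps straight to motor commands.
    new_frame = rng.normal(size=19)
    print("motor commands:", np.round(new_frame @ W, 3))
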
Lateral Skin Stretch Influences Direction Judgments of Motion Across the Skin

Background: Sliding of surfaces across the skin changes the position of edges and texture elements relative to the receptive fields of somatosensory neurons. This ‘successive positions’ cue is sufficient to elicit motion sensation. Surfaces also create lateral skin stretch due to friction, and we ask whether this potential cue influences perceived direction.

Method: A 4-pin array was applied to the forearm to manipulate the two motion cues independently. It indicated distal or proximal direction by pin activation in the order 1-2-3-4 or 4-3-2-1. Crucially, each pin also stretched the skin by moving 3.5 mm laterally in the same (Congruent cues) or opposite direction (Incongruent cues).

Results: 90 % of motion direction judgments accurately reflected succession of positions with a Congruent skin stretch cue but only 79 % with an Incongruent skin stretch cue (F(1,7) = 6.80, p = .035).

Conclusion: The skin stretch cue contributes to neural coding of motion direction.

Tatjana Seizova-Cajic, Kornelia Karlsson, Sara Bergstrom, Sarah McIntyre, Ingvars Birznieks
On the Effect of Vibration on Slip Perception During Bare Finger Contact

This study investigated the influence of the presence and timing of cutaneous vibration cues supplied to the finger pad on the perception of slip of a contact surface slid beneath it. We designed an apparatus that made it possible to supply precisely controlled shear force, sliding displacement and vibration cues to the finger pad via a moving surface. We conducted an experiment to assess the effect, if any, of the presence and timing of vibrotactile feedback presentation relative to slip onset on the perceived duration of slipping between the finger and the sliding surface. We found that vibrotactile stimuli that are presented at slip onset or during the slip phase both increased the perceived duration of slipping. In contrast, if the same cues are presented during the stick phase, they tended to decrease perceived slip duration. These results support a perceptual role for cutaneous vibrations felt in slip estimation, and indicate an opposite perceptual interpretation depending on their timing relative to slip onset.

Hikaru Nagano, Yon Visell, Shogo Okamoto
Presentation of Surface Undulation to Hand by Several Discrete Stimuli

Humans cannot perceive wide and shallow surface undulations, which are hundreds of micrometers in height and hundreds of millimeters in width, with their fingertips, whereas they can perceive them by scanning the surface with the whole finger, including the palm, in the direction of the long side of the finger. We develop a haptic device that presents surface undulation to the hand, considering the mechanical interaction between the human hand and the surface undulation. It is composed of nine independent stimulator units that control the heights of nine finger pads of the index finger, the middle finger, and the ring finger (3 on each finger) according to the virtual surface. A fundamental experiment on perceived surface estimation shows the effectiveness of the haptic display.

Yoshihiro Tanaka, Yuki Goto, Akihito Sano
Roughness Perception of Micro-particulate Plate: A Study on Two-Size-Mixed Stimuli

Tactile roughness perception of fine textures has been investigated using non-uniform surface materials (e.g., polishing papers), which made it difficult to quantitatively discuss the relationship between physical surface properties and perceived roughness. To solve this issue, we made a surface plate that has a close-packed structure with micro-particles of a single size (uniform surface), and a surface plate that has a pseudo-close-packed structure with micro-particles of two sizes (mixed surface). In this paper, we estimated the subjective equality of perceived roughness of the mixed surfaces relative to the uniform surfaces, and compared the physical properties of the mixed and uniform surfaces that had perceptually equal roughness. We found that perceived roughness cannot be explained only by the physical distance between the micro-particles.

Hiroki Tsuboi, Makoto Inoue, Scinob Kuroki, Hiromi Mochiyama, Junji Watanabe
Design of a Haptic Magnifier Using an Ultrasonic Motor

The paper presents a serial architecture for an actuated manipulator that uses an ultrasonic motor. The serial architecture allows the kinetic relationship between the user’s input and a tool to be modified. The design of the device is presented. A load exhibiting fine details is used in order to show how a zooming effect in its haptic rendering can be achieved with the haptic magnifier. Finally, the design is validated through an experimental analysis.

Frédéric Giraud, Michel Amberg, Christophe Giraud-Audine, Betty Lemaire-Semail
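
A toy illustration of the zooming idea, under the common scaled-teleoperation convention that tool displacement is reduced and the reflected force amplified by the same factor; the gain value and function name are hypothetical and not the paper's actual control law.

```python
def magnified_interaction(user_displacement, tool_force, zoom=5.0):
    """Scaled coupling between the user's hand and the tool.

    The tool moves `zoom` times less than the hand (finer positioning),
    while the force fed back to the hand is amplified by the same factor,
    so small surface details become easier to feel.
    """
    tool_displacement = user_displacement / zoom
    user_force = tool_force * zoom
    return tool_displacement, user_force

# Example: a 10 mm hand motion maps to 2 mm at the tool; a 0.2 N contact
# force at the tool is rendered as 1 N at the hand.
print(magnified_interaction(10e-3, 0.2))
```
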
Classification of Texture and Frictional Condition at Initial Contact by Tactile Afferent Responses

Adjustments to friction are crucial for precision object handling in both humans and robotic manipulators. The aim of this work was to investigate the ability of machine learning to disentangle concurrent stimulus parameters, such as normal force ramp rate, texture and friction, from the responses of tactile afferents at the point of initial contact with the human finger pad. Three textured stimulation surfaces were tested under two frictional conditions each, with a 4 N normal force applied at three ramp rates. During stimulation, the responses of fourteen afferents (5 SA-I, 2 SA-II, 5 FA-I, 2 FA-II) were recorded. A Parzen window classifier was used to classify ramp rate, texture and frictional condition using spike count, first spike latency or peak frequency from each afferent. This is the first study to show that ramp rate, texture and frictional condition could be classified concurrently with over 90 % accuracy using a small number of tactile sensory units.

Heba Khamis, Stephen J. Redmond, Vaughan Macefield, Ingvars Birznieks
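
A minimal sketch of a Parzen window classifier of the kind named in the abstract, applied to synthetic per-afferent spike-count feature vectors; the data, bandwidth, and class labels are placeholders.

```python
import numpy as np

def parzen_classify(x, train_X, train_y, bandwidth=1.0):
    """Assign `x` to the class whose training samples give the highest
    summed Gaussian kernel density (a Parzen window classifier)."""
    classes = np.unique(train_y)
    scores = []
    for c in classes:
        Xc = train_X[train_y == c]
        d2 = np.sum((Xc - x) ** 2, axis=1)
        scores.append(np.sum(np.exp(-d2 / (2 * bandwidth ** 2))))
    return classes[int(np.argmax(scores))]

# Toy example: feature vectors of per-afferent spike counts (14 afferents),
# two frictional-condition classes.
rng = np.random.default_rng(0)
train_X = np.vstack([rng.poisson(5, size=(20, 14)),
                     rng.poisson(9, size=(20, 14))]).astype(float)
train_y = np.array([0] * 20 + [1] * 20)
print(parzen_classify(rng.poisson(9, size=14).astype(float), train_X, train_y))
```
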
Exploring the Role of Dynamic Audio-Haptic Coupling in Musical Gestures on Simulated Instruments

Recent developments at ACROE-ICA have yielded a first audio-haptic modeller-simulator for real-time simulation of mass-interaction physical models. This paper discusses the conceptual and technological considerations for a haptic platform that supports dynamic coupling for audio-haptic interactions, in particular in the case of physical coupling with virtual musical instruments. We present work on system calibration and metrology, then demonstrate a model developed with this platform that exhibits dynamic gesture-sound coupling. The quality of the haptic device's dynamics, along with the modularity of mass-interaction physical modelling, allows for subtle and highly precise interaction with simulated objects, enabling intricate musical gestures that rely on high-bandwidth audio-haptic coupling.

James Leonard, Jean-Loup Florens, Claude Cadoz, Nicolas Castagné
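
A minimal, self-contained sketch of mass-interaction physical modelling (a chain of point masses coupled by spring-damper interactions, excited by a brief "gesture" force), not the ACROE-ICA simulator itself; parameter values are illustrative.

```python
import numpy as np

def simulate_chain(n_masses=8, steps=4410, fs=44100.0,
                   m=1e-3, k=2000.0, d=0.02, pluck_force=0.5):
    """Semi-implicit (symplectic) Euler simulation of a chain of point masses
    linked by spring-damper interactions, with the first mass anchored to a
    fixed point -- a crude mass-interaction model of a plucked string."""
    dt = 1.0 / fs
    x = np.zeros(n_masses)      # displacements
    v = np.zeros(n_masses)      # velocities
    out = np.zeros(steps)       # 'audio' output: position of the last mass
    for t in range(steps):
        f = np.zeros(n_masses)
        f[0] += -k * x[0] - d * v[0]               # anchor spring-damper to ground
        for i in range(n_masses - 1):
            fi = k * (x[i + 1] - x[i]) + d * (v[i + 1] - v[i])
            f[i] += fi
            f[i + 1] -= fi
        if t < 10:
            f[0] += pluck_force                     # brief 'gesture' on the first mass
        v += (f / m) * dt                           # update velocities first
        x += v * dt                                 # then positions (symplectic step)
        out[t] = x[-1]
    return out

signal = simulate_chain()
```
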
A Device and Method for Multimodal Haptic Rendering of Volumetric Stiffness

Vibrotactile signals are produced during haptic exploration of compressible objects through a variety of contact and bulk mechanical processes. Prior studies have found that vibrotactile feedback can influence stiffness perception, but the reason for this is unknown. Here, we investigated the role of vibration in stiffness perception during object squeezing. We propose a physically motivated explanatory model and rendering algorithm relating vibrotactile and force-displacement cues, then present a novel haptic interface that was designed to accurately reproduce these cues. Finally, we present the results of an experiment on the perceptual integration of vibrotactile and force-displacement cues during one- and two-finger stiffness perception. The results indicate that vibration feedback can increase perceived object softness during interaction with one finger, but preclude a similar conclusion for two-finger grasping. We argue that the results support the proposed model once innate differences in one- and two-finger exploration are accounted for.

Yon Visell, Keerthi Adithya Duraikkannan, Vincent Hayward
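
A minimal sketch of the general idea of superimposing a vibrotactile component on an elastic force-displacement law during squeezing, so that faster compression yields stronger vibration cues; this is a generic rendering scheme with hypothetical gains, not necessarily the authors' model.

```python
import numpy as np

def render_squeeze(displacement, velocity, t,
                   stiffness=300.0, vib_gain=0.05, vib_freq=200.0):
    """Force command for squeezing a virtual compliant object:
    an elastic term plus a velocity-scaled vibrotactile component."""
    elastic = stiffness * displacement
    vibration = vib_gain * np.abs(velocity) * np.sin(2 * np.pi * vib_freq * t)
    return elastic + vibration

t = np.linspace(0, 1.0, 1000)
x = 5e-3 * np.sin(np.pi * t)      # a 0 -> 5 mm -> 0 squeeze cycle
v = np.gradient(x, t)
force = render_squeeze(x, v, t)
```
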
A New Surface Display for 3D Haptic Rendering

This paper presents a new surface display that enables realistic 3D haptic rendering with richer haptic information. The visual feedback commonly provided by a touch surface includes depth and texture information. To render this information haptically, both kinesthetic and tactile feedback are needed. Although electrovibration or mechanical vibration is currently used to simulate these types of feedback, it is difficult to render such complex information using only one of the two. Moreover, in the case of electrovibration, active tactile feedback is not available because the sensation can be conveyed only while the fingertip slides over the touch surface. To address these issues, we propose a surface display that can generate both electrovibration and mechanical vibration uniformly over a large surface. We describe the operating principle and the vibration characteristics of the system. A psychophysical experiment was then conducted to evaluate the effect of mechanical vibration on the friction force perceived through electrovibration. Finally, we illustrate the potential of this approach as an effective method for 3D haptic rendering through several example applications.

Dongbum Pyo, Semin Ryu, Seung-Chan Kim, Dong-Soo Kwon
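
For orientation, the parallel-plate approximation commonly used in the electrovibration literature, in which the electrostatic attraction adds to the finger's normal force and thereby modulates sliding friction; the paper's own analysis may differ.

```latex
% Electrostatic attraction between the fingertip and an insulated electrode
% (A: apparent contact area, V: applied voltage, d: dielectric thickness,
%  F_n: finger normal force, \mu: friction coefficient):
F_e \approx \frac{\varepsilon_0 \varepsilon_r A V^2}{2 d^2},
\qquad
F_{\mathrm{friction}} \approx \mu \,\bigl(F_n + F_e\bigr)
```
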
Data-Driven Texture Rendering with Electrostatic Attraction

In this paper, we present a data-driven haptic rendering method applied to a tactile display based on electrostatic attraction force. To obtain realistic virtual textures, surface data from three materials are collected using an accelerometer and then replayed on the electrostatically actuated tactile display. The proposed data-driven texture rendering method was compared with standard square-wave excitation in a psychophysical experiment in which subjects rated the similarity between real samples and virtual textures rendered by both methods. The results show that virtual textures generated with the data-driven method were rated as significantly more similar to the real textures than those generated with the square-wave signal. In addition, the proposed method yielded a higher number of correct matches between virtual models and real materials.

Gholamreza Ilkhani, Mohammad Aziziaghdam, Evren Samur
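
A minimal sketch of the record-and-replay principle, assuming a stored acceleration trace is indexed by sliding distance so the spatial texture pattern is preserved at the current finger speed; the sampling rates, speeds, and synthetic recording are placeholders and not the paper's pipeline.

```python
import numpy as np

def build_excitation(recorded_accel, recorded_speed, finger_speed,
                     duration, fs=10000.0):
    """Replay a recorded texture acceleration trace, resampled so that the
    spatial pattern is preserved at the current finger speed."""
    n = int(duration * fs)
    t = np.arange(n) / fs
    # Distance travelled now vs. distance covered in the recording.
    distance_now = finger_speed * t
    distance_rec = np.arange(len(recorded_accel)) * (recorded_speed / fs)
    return np.interp(distance_now, distance_rec, recorded_accel,
                     period=distance_rec[-1])   # loop the recorded patch

# Synthetic 'recording': 1 s of texture acceleration captured at 50 mm/s.
fs = 10000.0
rec = np.sin(2 * np.pi * 40 * np.arange(int(fs)) / fs)   # stand-in data
excitation = build_excitation(rec, recorded_speed=50e-3,
                              finger_speed=80e-3, duration=0.5, fs=fs)
```
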
Design of a Miniature Integrated Haptic Device for Cutaneous, Thermal and Kinaesthetic Sensations

This paper presents the design of a new miniature integrated haptic device consisting of a cutaneous, a thermal, and a kinaesthetic module. To convey realistic tactile sensations in a compact package, the cutaneous module includes a 3 × 3 miniature pin-array display and an amplitude-modulated vibration generator. The pin-array display mimics both the small-scale shape and the roughness of a virtual surface, and the vibration generator creates a variety of amplitude-modulated vibrations to replicate the continuous texture of a virtual environment. The pin-array module is constructed from miniature magnetic linear actuators, and the vibration generator is fabricated using a stacked multilayer piezo-actuator, flexure structures, and supporting frames. To further convey the temperature and thermophysical properties of an object, a thermal module comprising a Peltier element and a cooling jacket is integrated with the cutaneous module. Because kinaesthetic information is as important as tactile information when perceiving objects, two kinaesthetic actuators operated by MR fluids are mounted on the control circuit board, which also supports the cutaneous and thermal modules. The proposed design is small enough to be installed in a human-computer interface, and the integrated haptic device can enhance the sense of reality with combined haptic feedback while users interact with graphical environments.

Tae-Heon Yang, Yu-Joon Kim, Yon-Kyu Park, Sang-Youn Kim
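
A minimal sketch of the amplitude-modulated vibration mentioned in the abstract: a fixed-frequency carrier whose envelope is shaped by a slowly varying texture signal; the carrier frequency and envelope here are placeholder values.

```python
import numpy as np

fs = 8000.0                          # drive-signal sample rate (placeholder)
t = np.arange(0, 1.0, 1.0 / fs)
carrier_hz = 250.0                   # carrier near peak Pacinian sensitivity
envelope = 0.5 * (1 + np.sin(2 * np.pi * 4 * t))        # slow texture envelope
drive = envelope * np.sin(2 * np.pi * carrier_hz * t)   # AM vibration command
```
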
Virtual Roughness Perception Using Coil Array Magnetic Levitation Haptic Interface: Effects of Torque Feedback

We have recently developed a compact, lightweight platform that is worn on the user's finger, so that haptic feedback is generated directly on the fingertip pad rather than through a grasped instrument handle. A planar array of cylindrical coils and a 3D rigid-body motion tracker generate forces and torques on the magnet in the levitated platform worn by the user. This study compares the perception of sinusoidal surface roughness across three renderings: (1) no torque feedback, (2) torque feedback that follows the local surface slope, and (3) torque feedback that prevents roll and pitch rotations of the fingertip. The results show that perceived roughness decreases with wavelength in all three conditions and that, independently of this, torque feedback has strong effects.

Sahba Aghajani Pedram, Roberta Klatzky, Oran Isaac-Lowry, Peter Berkelman
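
A minimal sketch of how the three renderings could be generated for a virtual sinusoidal grating: a penetration-based normal force in all conditions, plus a torque that either follows the local surface slope or servos roll/pitch back toward zero; all gains and parameter values are hypothetical.

```python
import numpy as np

def sinusoid_render(x, z, measured_tilt=0.0, amp=0.5e-3, wavelength=5e-3,
                    k_n=2000.0, k_t=0.05, mode="slope_torque"):
    """Force/torque command for a fingertip at lateral position x and height z
    above a virtual sinusoidal grating h(x) = amp * sin(2*pi*x/wavelength)."""
    h = amp * np.sin(2 * np.pi * x / wavelength)
    slope = amp * (2 * np.pi / wavelength) * np.cos(2 * np.pi * x / wavelength)
    penetration = max(0.0, h - z)
    f_normal = k_n * penetration           # push the fingertip out of the surface
    if mode == "slope_torque":
        torque = k_t * np.arctan(slope)    # tilt the platform to follow the local slope
    elif mode == "hold_flat":
        torque = -k_t * measured_tilt      # servo roll/pitch back toward zero
    else:
        torque = 0.0                       # no torque feedback
    return f_normal, torque

print(sinusoid_render(x=1e-3, z=0.0))
```
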
Recording Device for Natural Haptic Textures Felt with the Bare Fingertip

The perception of haptic textures depends on the mechanical interaction between a surface and a biological sensor. A texture is apprehended by sliding one’s fingers over the surface of an object. We describe here an apparatus that makes it possible to record the mechanical fluctuations arising from the friction between a human fingertip and easily interchangeable samples. Using this apparatus, human participants tactually scanned material samples. The analysis of the results indicates that the biomechanical characteristics of individual fingertips clearly affected the mechanical fluctuations. Nevertheless, the signals generated for a single material sample under different conditions showed some invariant features. We propose that this apparatus can be a valuable tool for the analysis of natural haptic surfaces.

Jonathan Platkiewicz, Alessandro Mansutti, Monica Bordegoni, Vincent Hayward
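
An illustrative sketch of the kind of analysis the abstract alludes to: estimating the power spectrum of recorded friction-fluctuation signals to look for spectral features shared across scans of the same sample; the data here are synthetic, not recordings from the apparatus.

```python
import numpy as np
from scipy.signal import welch

def fluctuation_spectrum(signal, fs=10000.0):
    """Power spectral density of a recorded friction-fluctuation signal."""
    freqs, psd = welch(signal, fs=fs, nperseg=1024)
    return freqs, psd

# Two synthetic 'scans' of the same sample: a shared texture-related component
# plus differing low-frequency biomechanical components and noise.
fs, n = 10000.0, 20000
t = np.arange(n) / fs
texture = 0.2 * np.sin(2 * np.pi * 180 * t)
scan_a = texture + 0.5 * np.sin(2 * np.pi * 3 * t) + 0.05 * np.random.randn(n)
scan_b = texture + 0.3 * np.sin(2 * np.pi * 7 * t) + 0.05 * np.random.randn(n)
fa, pa = fluctuation_spectrum(scan_a, fs)
fb, pb = fluctuation_spectrum(scan_b, fs)   # both spectra share a peak near 180 Hz
```
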
Virtual Surface Textures Created by MEMS Tactile Display

This paper reports the creation of virtual surface textures using a MEMS tactile display. The display consists of large-displacement MEMS actuators with hydraulic amplification mechanisms. By controlling the displacement, the vibration frequency, and the actuator driving patterns, the display can generate the tactile feel of various surface textures at a fingertip. To investigate the produced virtual tactile sensations quantitatively, we prepared 18 comparison samples, such as wood, urethane foam, and sandpapers. Subjects were asked to select the sample whose texture felt most similar to the one produced by the display. We categorized the samples with respect to roughness, softness, surface energy, and warmness, and correlated the control parameters with the selected samples. We also used the amount of information to analyse how consistently samples were selected across subjects. We found experimentally that actuator displacement correlated strongly with roughness and that the display was perceived as presenting hard surfaces except when apparent motion was generated.

Yumi Kosemura, Junpei Watanabe, Hiroaki Ishikawa, Norihisa Miki
Toward Active Boundary Conditions for Variable Friction Touchscreens

Variable friction touchscreens rely on ultrasonic plate vibrations to generate tactile effects on the surface. We propose using active boundary conditions (dynamically changing plate boundary conditions) to generate haptic effects on variable friction touchscreens. This method has several advantages over current designs, including greater control over the plate's mode shape and the potential to generate new tactile effects on the plate surface. To explore the feasibility of this novel technique, we developed a plate model in ANSYS and conducted a series of simulations investigating three boundary condition parameters: the location, the length, and the "stiffness" of the constraint. We conclude that a variable-stiffness boundary condition offers great potential for systematic control of mode shapes and opens up possibilities for generating new tactile effects from these surfaces.

Hoechel Lee, Karl Katumu, Jenna Gorlewicz
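
For orientation only, the textbook natural frequencies of a thin, simply supported rectangular plate; altering the boundary conditions, as the active-boundary approach proposes, shifts these frequencies and reshapes the corresponding modes. This relation is standard plate theory and is not a result from the paper.

```latex
% Thin, simply supported rectangular plate (side lengths a, b; thickness h;
% density \rho; flexural rigidity D):
\omega_{mn} = \pi^{2}\left[\left(\frac{m}{a}\right)^{2}
            + \left(\frac{n}{b}\right)^{2}\right]\sqrt{\frac{D}{\rho h}},
\qquad
D = \frac{E h^{3}}{12\,(1-\nu^{2})}
```
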
Tactile Display to Represent Stiffness Distribution of Human Tissue Using Magnetorheological Fluid

This paper demonstrates a tactile display that represents stiffness distributions using magnetorheological fluid. Intravital palpation is effective for detecting small tumors that CT scans or endoscopes cannot reveal. To enable such palpation, we propose a tactile display that reproduces the stiffness of human tissue. We exploit magnetorheological fluid, whose mechanical characteristics change with an external magnetic field. First, we investigated the mechanical properties of the display through compression tests. Next, we compared these mechanical properties with human tactile sensation. The experimental results show that the display can represent tumor tissue 5 mm or smaller in size.

Hiroki Ishizuka, Nicolo Rorenzoni, Norihisa Miki
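
For context, magnetorheological fluids are commonly described with a Bingham-plastic model in which the yield stress grows with the applied magnetic field; this is what lets an external field modulate the apparent stiffness felt at the display surface. This standard relation is included for orientation and is not taken from the paper.

```latex
% Bingham-plastic approximation for an MR fluid under shear
% (\tau_y: field-dependent yield stress, H: magnetic field strength,
%  \eta: plastic viscosity, \dot{\gamma}: shear rate):
\tau = \tau_{y}(H)\,\operatorname{sgn}(\dot{\gamma}) + \eta\,\dot{\gamma},
\qquad |\tau| > \tau_{y}(H)
```
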
Microfabricated Needle-Arrays for Stimulation of Tactile Receptors

This paper describes an electrotactile display with micro-needle electrodes. An electrotactile display produces tactile sensations by stimulating tactile receptors with electric currents. Micro-needle electrodes can drastically decrease the electrical impedance because the needles penetrate the stratum corneum, which has a higher impedance than the dermis. In addition, by adjusting the needle length, the tactile receptors can be stimulated without pain. The shape of the micro-needle electrodes affects the tactile sensation: when the tip radius is too small, the impedance between the finger and the micro-needles becomes large because of the small contact area; when the tip radius is too large, the needle cannot penetrate the skin surface and the impedance does not decrease sufficiently. In this work, we developed fabrication processes for a micro-needle array and an electrotactile display using the micro-needle electrode array. We experimentally confirmed the superiority of needle-electrode devices over flat devices and determined the optimum shape of the micro-needle electrodes for electrotactile display applications.

Norihide Kitamura, Julien Chim, Norihisa Miki
Backmatter
Metadata
Title
Haptics: Neuroscience, Devices, Modeling, and Applications
Edited by
Malika Auvray
Christian Duriez
Copyright year
2014
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-662-44193-0
Print ISBN
978-3-662-44192-3
DOI
https://doi.org/10.1007/978-3-662-44193-0
