
2025 | Book

Haptics: Understanding Touch; Technology and Systems; Applications and Interaction

14th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2024, Lille, France, June 30 – July 3, 2024, Proceedings, Part I

Edited by: Hiroyuki Kajimoto, Pedro Lopes, Claudio Pacchierotti, Cagatay Basdogan, Monica Gori, Betty Lemaire-Semail, Maud Marchal

Publisher: Springer Nature Switzerland

Book series: Lecture Notes in Computer Science

About this book

The two-volume set LNCS 14768 + 14769 constitutes the refereed proceedings of the 14th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2024, held in Lille, France, during June 30 – July 3, 2024.

The 81 full papers presented were carefully reviewed and selected from 142 submissions. They were organized in topical sections as follows: understanding touch; technology and systems; applications and interaction.

Table of Contents

Frontmatter

Understanding Touch

Frontmatter
Human Identification Performance of Vibrotactile Stimuli Applied on the Torso Along Azimuth or Elevation

This study investigates the human recognition performance of vibrotactile stimuli applied on the torso along azimuth or elevation under the context of tactile egocentric directional cueing. We conducted two absolute identification experiments for azimuth and elevation, respectively, using real tactile stimuli generated by physical actuators and illusory stimuli rendered by funneling illusion. For both azimuth and elevation, the recognition accuracies were very high, over 95.6%, when only real stimuli were used to indicate 6–8 directions. However, combining the real stimuli with the illusory ones to double the spatial resolution resulted in significantly lower accuracies between 74.1% and 77.8%. The estimated information transfer (IT) values also remained very similar. Using identical methods, our results quantify the human identification performance of torso-distributed tactile stimuli for azimuth or elevation. The detailed results provide general guidelines for designing torso-based tactile systems to enhance spatial awareness and navigation.

Junwoo Kim, Jaejun Park, Chaeyong Park, Junseok Park, Seungmoon Choi
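
The information transfer (IT) values mentioned above are conventionally estimated as the mutual information between stimulus and response in an absolute identification task. Below is a minimal sketch of that standard estimate; it is not the authors' code, and the confusion matrix is hypothetical.

```python
import numpy as np

def information_transfer(confusion):
    """Estimate information transfer (bits) from a stimulus-response
    confusion matrix, i.e. the maximum-likelihood estimate of the mutual
    information between stimulus and response."""
    counts = np.asarray(confusion, dtype=float)
    p_joint = counts / counts.sum()                  # P(stimulus, response)
    p_stim = p_joint.sum(axis=1, keepdims=True)      # P(stimulus)
    p_resp = p_joint.sum(axis=0, keepdims=True)      # P(response)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = p_joint * np.log2(p_joint / (p_stim * p_resp))
    return np.nansum(terms)                          # 0*log(0) terms drop out

# Hypothetical 4-direction example with near-perfect identification.
conf = np.array([[24, 1, 0, 0],
                 [ 0, 25, 0, 0],
                 [ 0, 0, 23, 2],
                 [ 1, 0, 0, 24]])
print(f"Estimated IT: {information_transfer(conf):.2f} bits")  # close to log2(4) = 2 bits
```
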
Utilizing Absence of Pacinian Corpuscles in the Forehead for Amplitude-Modulated Tactile Presentation

This study proposes a method for tactile presentation to the forehead, focusing on the low-frequency band with a small linear resonant actuator. We exploited the absence in the forehead of Pacinian corpuscles, which are responsible for sensing vibrations around 200 Hz, and used amplitude modulation to compensate for the limited frequency range of linear resonant actuators. The amplitude modulation was achieved with a carrier wave of approximately 200 Hz, near the actuator’s resonant frequency. In two experiments, we investigated the efficacy of amplitude modulation in representing low-frequency vibrations (Experiment 1) and the quality of vibration (Experiment 2) on the forehead. We found that participants could clearly feel the original low-frequency vibration on the forehead compared to other body locations.

Yuma Akiba, Shota Nakayama, Keigo Ushiyama, Izumi Mizoguchi, Hiroyuki Kajimoto
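
For readers unfamiliar with the amplitude-modulation scheme described above, here is a minimal signal-generation sketch. The roughly 200 Hz carrier follows the abstract; the 20 Hz envelope, sample rate, and waveform shapes are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def am_drive(envelope_hz, carrier_hz=200.0, duration_s=1.0, fs=8000):
    """Amplitude-modulate a carrier near the LRA's resonant frequency with a
    low-frequency envelope, so the envelope (not the carrier) conveys the
    intended low-frequency vibration."""
    t = np.arange(0, duration_s, 1.0 / fs)
    envelope = 0.5 * (1.0 + np.sin(2 * np.pi * envelope_hz * t))  # 0..1
    carrier = np.sin(2 * np.pi * carrier_hz * t)
    return t, envelope * carrier

# Example: a 20 Hz envelope on a 200 Hz carrier.
t, drive_signal = am_drive(envelope_hz=20.0)
```
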
Optimizing Haptic Feedback in Virtual Reality: The Role of Vibration and Tangential Forces in Enhancing Grasp Response and Weight Perception

This study explored haptic feedback methods that do not rely on tangential forces, aiming to address the miniaturization challenges of haptic feedback devices in current virtual reality environments. We employed a six-degree-of-freedom haptic interface to construct a virtual grasping scene that simulates the stick-slip phenomenon. We collected data on users’ grip responses by comparing grip response speed and force adjustment capability under three conditions: no haptic feedback, vibration feedback, and vibration with tangential force feedback. The results demonstrate that vibration feedback can significantly improve users’ grip response speed without tangential forces. However, without tangential forces, users find it hard to differentiate responses based on the weight of objects. This indicates that while vibration feedback can simplify the design of haptic devices and enhance response speed, tangential forces are still essential for accurate weight perception. The theoretical and practical significance of this research lies in providing a new direction for haptic feedback devices in virtual reality.

Yunxiu Xu, Siyu Wang, Shoichi Hasegawa
Audiovisual-Haptic Simultaneity Perception Across the Body for Multisensory Applications

This paper explores human performance on simultaneity judgments between audiovisual and haptic stimuli across the body for use in future real-time haptic applications. Three representative body sites, the torso, fingertip, and foot, were stimulated with vibration, and a media clip was used as an audiovisual stimulus. The results showed that: (1) the timing delay that humans can tolerate in the audiovisual-leading, haptic-following case was 55 ms at the fingertip, 65 ms at the chest, and 45 ms at the foot, and these values were significantly different from each other, (2) the regression curves shifted toward the haptic-leading direction from the chest down to the foot, showing significantly different points of subjective synchrony (PSS) between the body sites but similar window widths of simultaneity, (3) the PSSs were obtained between 20 and 40 ms in the haptic-leading, audiovisual-following case, and (4) a significant asymmetry was observed in the curves between haptic-leading and audiovisual-leading stimuli, with a higher temporal sensitivity in the audiovisual-leading case. We expect that our results can provide essential information for multisensory applications.

Jiwan Lee, Gyeore Yun, Seungmoon Choi
Apparent Thermal Motion on the Forearm

The concept of Apparent Tactile Motion (ATM) has been extensively studied in the field of haptics, allowing people to perceive a sense of dynamic motion through tactile stimuli such as vibrations, tapping, or mid-air stimuli. However, there is a lack of research on whether a similar perception of motion can be achieved using thermal stimuli. As prior research suggests that the stimulus onset asynchrony (SOA) of two stimuli is a particularly significant contributor to the perception of motion, in this study we examine different SOAs between two warm stimuli on the forearm in order to induce a sensation of motion. Our results indicate that the sensation of motion can be achieved on the forearm with SOAs close to the signal duration. We further found a negative correlation between SOAs and the perception of speed, and we report findings on participants’ perceptions of motion through drawings. With our study, we strengthen the understanding of dynamic thermal feedback through apparent thermal motion, which may lead to the development of lighter and more sustainable wearable thermal devices.

Tim Moesgen, Hsin-Ni Ho, Yu Xiao
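
The SOA manipulation above can be made concrete with a small timing sketch; the actuator callables and the specific SOA and duration values are hypothetical stand-ins, not the authors' setup.

```python
import time

def present_pair(stim_a_on, stim_a_off, stim_b_on, stim_b_off,
                 duration_s=1.0, soa_s=0.8):
    """Present two warm stimuli of equal duration with a given stimulus
    onset asynchrony (SOA). The callables switch hypothetical thermal
    actuators on and off."""
    events = sorted([
        (0.0, stim_a_on),
        (soa_s, stim_b_on),
        (duration_s, stim_a_off),
        (soa_s + duration_s, stim_b_off),
    ], key=lambda e: e[0])
    t0 = time.monotonic()
    for t_event, action in events:
        time.sleep(max(0.0, t_event - (time.monotonic() - t0)))
        action()

# Example with print stand-ins for actuator drivers.
present_pair(lambda: print("A on"), lambda: print("A off"),
             lambda: print("B on"), lambda: print("B off"),
             duration_s=0.5, soa_s=0.4)
```
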
Surface Tactile Presentation to the Palm Using an Aerial Ultrasound Tactile Display

In tactile interaction in a virtual space, tactile presentation to the palm over a wider plane is required. A highly uniform tactile stimulus is required to present a stationary flat surface on the palm, and it is necessary to identify the parameters of tactile presentation that improve uniformity. Generating ultrasound pressure distribution over a surface area improves the uniformity of tactile stimulus, whereas the perceived intensity decreases as the presentation area expands because the sound pressure is dispersed. In this study, we aimed to present a stronger surface stimulus by using spatiotemporal modulation (STM) to rapidly move the ultrasound focus within the presentation area and thus distribute the concentrated sound pressure over the entire area. We showed that tactile presentation of a square area as a simple shape can be achieved using STM by adjusting the focal path and the number of path passes per unit of time, thereby improving the perceived intensity while maintaining spatial uniformity and temporal constancy compared to acoustic field control.

Naoki Kishi, Atsushi Matsubayashi, Yasutoshi Makino, Hiroyuki Shinoda
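
As an illustration of spatiotemporal modulation (STM), the sketch below samples focus positions along one plausible focal path, the perimeter of a square, at a given number of path passes per second. The path shape, dimensions, and rates are assumptions, not the parameters used in the paper.

```python
import numpy as np

def square_path_focus(side_m=0.02, passes_per_s=20.0, fs=1000, duration_s=1.0):
    """Sample ultrasound focus positions that repeatedly trace a square
    perimeter (STM). Returns (t, x, y) with the square centred at the origin
    in the presentation plane."""
    t = np.arange(0, duration_s, 1.0 / fs)
    s = (t * passes_per_s) % 1.0            # progress along one pass, 0..1
    edge = np.floor(s * 4).astype(int)      # which edge of the square
    u = (s * 4) % 1.0                       # progress along that edge
    half = side_m / 2
    x = np.select([edge == 0, edge == 1, edge == 2, edge == 3],
                  [-half + u * side_m,  half,  half - u * side_m, -half])
    y = np.select([edge == 0, edge == 1, edge == 2, edge == 3],
                  [-half, -half + u * side_m,  half,  half - u * side_m])
    return t, x, y

t, x, y = square_path_focus()   # 20 passes per second around a 2 cm square
```
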
Humans Terminate Their Haptic Explorations According to an Interplay of Task Demands and Motor Effort

Haptic exploration is an inherently active process by which humans gather sensory information through physical contact with objects. It has been proposed that humans generally optimize their exploration behavior to improve perception. We hypothesized that the duration of haptic explorations is the result of an optimal interplay of sensory and predictive processes, also taking costs such as motor effort into account. We assessed exploration duration and task performance in a two-alternative forced-choice spatial frequency discrimination task under varying conditions of task demand and motor effort. We manipulated task demands by varying the discriminability of virtual grating stimuli and manipulated motor effort by implementing forces counteracting the participants’ movements while switching between stimuli. Participants were instructed to switch between stimuli after each swipe movement. Results revealed that higher task demands lead to higher numbers of exploratory movements (i.e. longer exploration duration), likely reflecting a compensatory mechanism that enables participants to attain a certain level of task performance. However, this effect is reduced when motor effort is increased; while low and medium task demands yield similar numbers of movements regardless of related motor effort, higher demands are not associated with increased numbers of movements when the required motor effort is high. In conclusion, the extent to which increased task demands are compensated via the extension of an exploration seems to depend on the motor costs that the agent is confronted with.

Michaela Jeschke, Anna Metzger, Knut Drewing
The TIP Benchmark: A Tactile Image-Based Psychophysics-Inspired Benchmark for Artificial Tactile Sensors

We introduce a comprehensive benchmarking method, the TIP benchmark, to assess the spatial acuity of tactile sensors. The TIP benchmark is made up of 4 stages, in which data output from a tactile sensor performing a psychophysics-inspired task is converted into images. The Structural Similarity Index Measure (SSIM) and support vector machine (SVM) classifier are then used to evaluate sensor performance across 4 metrics, representing sensor accuracy (Accuracy, Distance), stability (IQR) and generalisability (Margin). The TIP benchmark is validated to determine an ideal indentation depth and evaluate noise degradation on a tactile task, and is then employed for a grid search hardware optimization of a neuromorphic tactile sensor (9 configurations, 2 hardware design parameters). Sensors with shorter, denser internal pins are shown as having greater spatial acuity on a grating orientation discrimination task, demonstrating the TIP benchmark’s potential to quantitatively compare tactile sensors, paving the way for establishing unified methods for hardware design and benchmarking for real-world applications.

Tianyi Liu, Benjamin Ward-Cherrier
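
The abstract names SSIM and an SVM-based margin as evaluation metrics; the sketch below shows how such metrics can be computed on hypothetical tactile images with standard libraries. It is not the TIP benchmark pipeline itself, and all data are synthetic.

```python
import numpy as np
from skimage.metrics import structural_similarity as ssim
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical tactile "images" reconstructed from sensor output:
# one reference image plus noisy repeated acquisitions.
reference = rng.random((32, 32))
repeats = [np.clip(reference + rng.normal(0, 0.05, reference.shape), 0, 1)
           for _ in range(10)]

# Accuracy- and stability-style metrics: similarity of repeats to the reference.
scores = [ssim(reference, img, data_range=1.0) for img in repeats]
iqr = np.subtract(*np.percentile(scores, [75, 25]))
print(f"mean SSIM = {np.mean(scores):.3f}, IQR = {iqr:.3f}")

# Generalisability-style metric: margin of a linear SVM separating two classes
# of flattened images (margin = 2 / ||w||).
X = np.vstack([rng.random((20, 64)), rng.random((20, 64)) + 0.5])
y = np.array([0] * 20 + [1] * 20)
clf = SVC(kernel="linear").fit(X, y)
print(f"SVM margin = {2.0 / np.linalg.norm(clf.coef_):.3f}")
```
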
Towards Intensifying Perceived Pressure in Midair Haptics: Comparing Perceived Pressure Intensity and Skin Displacement Between LM and AM Stimuli

Ultrasound Midair Haptics (UMH) can present various noncontact tactile patterns by focusing ultrasound on human skin. With UMH, a steady pressure sensation can be presented by periodically shifting a stimulus point (ultrasound focus) at several hertz. Such a stimulus with a periodic focal shift is called Lateral Modulation (LM). The perceived intensity of this pressure sensation was several times stronger than the applied radiation force (e.g., 0.22 N for 27 mN of radiation force). Further intensifying the pressure sensation by LM expands the range of reproducible tactile sensations, such as a hard object; however, a stimulus design guideline for this intensification has not been established because the perception mechanism of the LM-evoked pressure sensation is still unclear. Towards intensifying the pressure sensations in UMH, this study investigates the effects of the main frequency components of skin vibrations produced by LM, and of their amplitude, on the perceived pressure intensity. We first confirmed that the perceived pressure intensity of 5 Hz LM was stronger than that of 5 Hz amplitude modulation (AM), a simple vibration with a fixed stimulus position. We also measured the 5 Hz vibration amplitude of the skin during stimulation and confirmed no significant difference in amplitude between LM and AM. The results showed that a 5 Hz skin vibration and its amplitude alone cannot explain the perceived intensity of the pressure sensation evoked by LM. These results indicate that other factors in LM, such as focal shifts, would be necessary to present stronger pressure sensations.

Tao Morisaki, Yusuke Ujitoko
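
A minimal sketch contrasting the two stimulus types compared above: in LM the focus position moves periodically at a few hertz, whereas in AM the position is fixed and only the amplitude is modulated. The circular focal path and its 3 mm radius are assumptions for illustration.

```python
import numpy as np

def lm_focus_position(t, shift_hz=5.0, radius_m=0.003):
    """Lateral Modulation: the ultrasound focus itself moves periodically
    (here on a small circle) while the radiation force stays constant."""
    return (radius_m * np.cos(2 * np.pi * shift_hz * t),
            radius_m * np.sin(2 * np.pi * shift_hz * t))

def am_amplitude(t, mod_hz=5.0):
    """Amplitude Modulation: the focus position is fixed and only the
    radiation-force amplitude is modulated at the same frequency."""
    return 0.5 * (1.0 + np.sin(2 * np.pi * mod_hz * t))

t = np.linspace(0, 1, 1000)
x, y = lm_focus_position(t)   # moving focus (LM)
a = am_amplitude(t)           # fixed focus, modulated amplitude (AM)
```
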
Virtual Hand Illusion Induced by Suction Pressure Stimulation to the Face

The body ownership toward avatars in virtual environments is a key factor for enhancing the quality of virtual reality experiences. Despite previous beliefs that spatiotemporal congruence of visuo-motor and visuo-tactile stimulation was essential for inducing body ownership, this paper shows that body ownership can occur with spatial incongruence of visuo-tactile stimulation, specifically when tactile stimuli are applied to the face. We developed a suction pressure stimulation display system integrated with a head-mounted display in an immersive virtual environment, where tactile feedback corresponded to the interaction between a virtual hand and virtual object. Three tactile stimulation conditions were compared: no tactile feedback, stimulation to the fingertips, and stimulation to the face. In an experiment involving ten participants, we observed that body ownership was induced with tactile feedback provided to the fingertips or face. These results indicate that body ownership can be induced even under a large spatial incongruence between visuo-tactile stimulation, and provide insights for designing tactile displays that present tactile stimuli to different body parts.

Takayuki Kameoka, Taku Hachisu, Hiroyuki Kajimoto
Task-Adapted Single-Finger Explorations of Complex Objects

The perception of material/object properties plays a fundamental role in our daily lives. Previous research has shown that individuals use distinct and consistent patterns of hand movements, known as exploratory procedures (EPs), to extract perceptual information relevant to specific material/object properties. Here, we investigated the variation in EP usage across different tasks involving objects that varied in task-relevant properties (shape or deformability) as well as in task-irrelevant properties (deformability or texture). Participants explored 1 reference object and 2 test objects with a single finger before selecting the test object that was most similar to the reference. We recorded their finger movements during explorations, and these movements were then categorised into different EPs. Our results show strong task-dependent usage of EPs, even when exploration was confined to a single finger. Furthermore, within a given task, EPs varied as a function of material/object properties unrelated to the primary task. These variations suggest that individuals flexibly adapt their exploration strategies to obtain consistent and relevant information.

Lisa Pui Yee Lin, Alina Böhm, Boris Belousov, Alap Kshirsagar, Tim Schneider, Jan Peters, Katja Doerschner, Knut Drewing
Exploring Frequency Modulation in Decoding Edge Perception Through Touch

This research explores how different vibration frequencies influence our ability to perceive and identify edges through touch. The study used a PinArray device to simulate various edge shapes, testing if participants could recognize these shapes based on pairing two frequencies (LF-HF). The findings showed that certain frequency combinations in low and medium ranges were particularly effective in conveying step edge and ridge shapes. This has important implications for enhancing tactile technology in fields like robotics and assistive devices by improving the realism of tactile virtual shapes.

Mounia Ziat, Iliyas Tursynbek, Thu Pham, Allison Ling
The Role of Implicit Prior Information in Haptic Perception of Softness

People regularly use active touch to perform daily life tasks. Imagine choosing a comfortable pillow and how you would explore its softness. It is known that people tune their exploratory behavior to get the most relevant information. In the exploration process, prior information that is available before we touch an object is also used. For softness perception, object indentation plays a crucial role; indentation forces were higher when people implicitly expected to explore harder rather than softer objects. This force-tuning improved perception and was observed when trials of the same softness level (hard or soft) were presented in longer blocks. However, it was not reported for predictable patterns in which hard and soft stimuli alternate on every trial or every other trial. Here, we investigated when and how implicit prior information about the softness level becomes accessible for successful force-tuning in softness discrimination. Participants were presented with hard and soft stimulus pairs in sequences of 2, 4 or 6 trials. In predictable conditions, same-length sequences of hard and soft trials alternated constantly. In unpredictable conditions, we presented sequences of lengths 2, 4 and 6 randomly. We analyzed initial peak indentation forces. Participants applied higher forces to harder stimuli in the predictable condition in longer sequences (4 and 6) as compared to the unpredictable condition and shorter sequences of 2. We interpret the findings in terms of an anticipatory and incremental mechanism of force-tuning, which needs to be triggered by an initial predictable stimulus.

Didem Katircilar, Knut Drewing
Discovering the Causal Structure of Haptic Material Perception

The sensory signals that occur when we touch or interact with objects carry the information necessary to perceive and reason about object properties. Research on material perception has provided evidence that humans can categorize materials and assess their similarity based solely on haptic information. This evidence is based on the performance on classification tasks and correlation analyses, which, by definition, provide no information on the causes of the observed behavior. This paper explores the use of causal discovery methods to analyze human haptic perception of material categories. Causal discovery algorithms analyze statistical patterns in the data, such as conditional (in)dependence relationships, and then determine causal relationships between the variables that are compatible with these patterns. The result is a set of causal graphs with nodes representing the variables and directed edges representing empirically plausible causal relationships. In this paper, a causal discovery algorithm is used to analyze material category ratings and vibratory signals from haptic exploration. The goal is to understand the underlying cause-effect structure linking material samples, vibration signals, and category similarity ratings. The identified causal structure indicates that the information represented by the slope of the vibratory signal plays a key role in rating a material’s similarity to different categories, but in parts, it is only an indirect cause. The practical use of causal discovery methods for analyzing haptic perception data is demonstrated.

Jaime Maldonado, Christoph Zetzsche, Vanessa Didelez
The Visual and Haptic Contributions to Hand and Foot Representation

Research has shown systematic distortions in hand representation. It has been argued that these distortions reflect the somatosensory homunculus, and that it is through vision that the distortions are corrected. However, in those previous studies the haptic and visual tasks used were considerably different from one another. Therefore, in the current study, we devised a task that had identical requirements for each sensory condition. Specifically, participants haptically and visually judged whether variously sized gloves and shoes were bigger than their hands and feet. We found that participants overestimated their hand size significantly more in the haptic condition, while the feet were estimated similarly between conditions. Moreover, hand distortions in the haptic condition were significantly larger than foot distortions. Lastly, we also found sex differences, as females overestimated both hand and foot size significantly more than males. Taken together, our results support the suggestion that distortions in haptic body representation tasks are more somatotopic.

Lara A. Coelho, Anna Vitale, Carolina Tammurello, Claudio Campus, Claudia L. R. Gonzalez, Monica Gori
How Visualizing Touch Can Transform Perceptions of Intensity, Realism, and Emotion?

Social touch is a common method of communication between individuals, but touch cues alone provide only a glimpse of the entire interaction. Visual and auditory cues are also present in these interactions, and increase the expressiveness and recognition of the conveyed information. However, most mediated touch interactions have focused on providing only haptic cues to the user. Our research addresses this gap by adding visual cues to a mediated social touch interaction through an array of LEDs attached to a wearable device. This device consists of an array of voice-coil actuators that present normal force to the user’s forearm to recreate the sensation of social touch gestures. We conducted a human subject study (N = 20) to determine the relative importance of the touch and visual cues. Our results demonstrate that visual cues, particularly color and pattern, significantly enhance perceived realism, as well as alter perceived touch intensity, valence, and dominance of the mediated social touch. These results illustrate the importance of closely integrating multisensory cues to create more expressive and realistic virtual interactions.

Xin Zhu, Zhenghui Su, Jonathan Gratch, Heather Culbertson

Technology and Systems

Frontmatter
Asymmetric Hit-stop for Multi-user Virtual Reality Applications: Reducing Discomfort with the Movement of Others by Making Hit-stop Invisible

A hit-stop is an expression technique primarily used in fighting games that emphasizes the response to the impact felt by a player when the target’s movement is momentarily delayed at the moment of impact. It has been reported that the hit-stop technique can make users feel pseudo-impact forces when applied to virtual reality (VR). However, existing research has focused on a single user, and the effects of hit-stops when used by multiple users in the same virtual environment in a competitive game-like situation have not been verified. There is concern that users may feel discomfort, and the quality of the VR experience may be degraded by observing the difficult-to-predict movements of other players due to hit-stop. Therefore, this study examines the degree to which hit-stops in a multi-user VR cause discomfort to users and investigates the extent to which such discomfort can be mitigated by presenting hit-stops asymmetrically. We conducted an experiment in which two players played a tennis rally in a virtual environment to see whether they felt discomfort in the movements of the other player and themselves under four conditions: a combination of showing the player’s hit-stop compensated movement (symmetric condition) or their real movements without hit-stop correction (asymmetric condition) to their opponent, and showing the opponent’s hit-stop compensated movement or the opponent’s real movement without hit-stop correction to the player. The results showed that the asymmetric presentation of hit-stop can reduce discomfort while making the participant feel a pseudo-impact force.

Shinnosuke Noguchi, Keigo Matsumoto, Yuki Ban, Takuji Narumi
Design of Haptic Rendering Techniques for Navigating with a Multi-actuator Vibrotactile Handle

This paper presents the design and experimental evaluation of haptic rendering techniques for navigating using localized vibrotactile stimuli provided by a custom multi-actuator haptic handle. We present two haptic rendering schemes which are then used in combination with three navigation strategies to guide users along a path. We evaluate these techniques in a user study where 18 participants walk in an 8 × 8 m room, following haptic cues displayed by the handle. Results show that participants are able to navigate along the path successfully, with success rates ranging from 80% to 100% across conditions.

Pierre-Antoine Cabaret, Claudio Pacchierotti, Marie Babel, Maud Marchal
Evaluating Tactile Interactions with Fine Textures Obtained with Femtosecond Laser Surface Texturing

Tactile perception deteriorates with age, resulting in a negative impact on life quality. A clinical assessment of this decline could help to reduce its effects. Such a clinical apparatus for fine texture does not yet exist. Femtosecond laser surface texturing (LST) is capable of manufacturing fine textures on materials that are sufficiently robust for clinical requirements. This paper starts by addressing how LST can be used to manufacture surfaces for tactile tests, i.e. of sufficient dimensions to permit interrogation, and with a minimum quantity of uncontrolled surface features. Vibrotactile interrogation tests on textured surfaces demonstrate that the surface textures have controllable tactile signature and thus underline the suitability of the process for generating fine textures for tactile perception assessment.

G. Schuhler, H. Zahouani, J. Faucheu, Y. Di Maio, R. Vargiolu, M. W. Rutland
Tactile Clip: A Wearable Device for Inducing Softness Illusion Through Skin Deformation

We proposed the “Tactile Clip,” a device to induce an illusion of softness, and verified its effectiveness in augmenting softness perception. A Tactile Clip is a wearable device that provides circumferential-force stimulation to the feet. This force results in the deformation of the foot skin, consequently influencing the perception of softness. We conducted a psychophysical experiment using the interleaved staircase method. Twelve participants assessed foot softness by comparing the stepping sensation of the reference sample with the Tactile Clip to that of the test samples without the Tactile Clip. From the response rate, we derived a psychometric curve and bias value. The results showed a significantly positive bias in the perception, suggesting that the Tactile Clip made the flooring material feel softer than the actual softness. We consider the factors contributing to this phenomenon as a slight increase in the foot’s thickness by deformation of sole skin and/or cognitive effects due to changes in force stimulation on the side of the foot in response to stepping.

Hikari Yukawa, Natsuno Asano, Arata Horie, Kiryu Tsujita, Takatoshi Yoshida, Kouta Minamizawa, Yoshihiro Tanaka
SENS3: Multisensory Database of Finger-Surface Interactions and Corresponding Sensations

The growing demand for natural interactions with technology underscores the importance of achieving realistic touch sensations in digital environments. Realizing this goal highly depends on comprehensive databases of finger-surface interactions, which need further development. Here, we present SENS3 (www.sens3.net), an extensive open-access repository of multisensory data acquired from fifty surfaces when two participants explored them with their fingertips through static contact, pressing, tapping, and sliding. SENS3 encompasses high-fidelity visual, audio, and haptic information recorded during these interactions, including videos, sounds, contact forces, torques, positions, accelerations, skin temperature, heat flux, and surface photographs. Additionally, it incorporates thirteen participants’ psychophysical sensation ratings (rough–smooth, flat–bumpy, sticky–slippery, hot–cold, regular–irregular, fine–coarse, hard–soft, and wet–dry) while exploring these surfaces freely. Designed with an open-ended framework, SENS3 has the potential to be expanded with additional textures and participants. We anticipate that SENS3 will be valuable for advancing multisensory texture rendering, user experience development, and touch sensing in robotics.

Jagan K. Balasubramanian, Bence L. Kodak, Yasemin Vardar
Variable Curvature and Spherically Arranged Ultrasound Transducers for Depth-Adjustable Focused Ultrasound

In haptic feedback using focused ultrasound, a technique has been proposed that involves creating an ultrasound focal point using a spherical surface without the need for phase control. This method is efficient as each transducer faces directly toward the focal point, but it has a limitation – the focal point remains fixed due to the ultrasonic transducers being attached to a static sphere. This study introduces a method to move the focus in the depth direction by dynamically changing the curvature of the spherical surface to which the transducers are attached. We built a device that can adjust curvature using a servo motor, confirming its capability to relocate the focus position.

Shoha Kon, Eifu Narita, Izumi Mizoguchi, Hiroyuki Kajimoto
The HapticSpider: A 7-DoF Wearable Device for Cutaneous Interaction with the Palm

This paper introduces a 7-degrees-of-freedom (7-DoF) hand-mounted haptic device, the “HapticSpider”. It is composed of a parallel mechanism characterised by eight legs with an articulated diamond-shaped structure, in turn connected to an origami-like shape-changing end-effector. The device can render surface and edge touch simulations as well as apply normal, shear, and twist forces to the palm. This paper presents the device’s mechanical structure, a summary of its kinematic model, actuation control, and preliminary device evaluation, characterizing its workspace and force output.

Lisheng Kuang, Monica Malvezzi, Domenico Prattichizzo, Paolo Robuffo Giordano, Francesco Chinello, Claudio Pacchierotti
Vibrotactile Cues with Net Lateral Forces Resulting from a Travelling Wave

Ultrasonic travelling waves possess the capability to induce net shear forces on the finger pulp, which finds utility in various applications. This study specifically investigates their potential in generating vibrotactile cues through force modulation. Two experiments are outlined herein. In the first, we assess the detection threshold, estimated at 1.5 µm peak-to-peak for a modulation frequency of 75 Hz. Remarkably, this threshold closely aligns with that observed in conventional vibrotactile indentation and remains consistent regardless of the force modulation method employed. In the second experiment, we correlate the amplitude of a travelling wave tactile display with the vibration amplitude of a shaker to achieve equivalent vibration intensity. Our findings reveal a substantial amplitude ratio range experienced on the travelling wave device, spanning from 24 to 155, which is pivotal in optimizing vibrotactile stimulator design. Additionally, we demonstrate that vibrations below 1 µm yield negligible net lateral forces on the finger pulp. These results underscore the potential of ultrasonic travelling waves in enhancing tactile feedback systems.

Mondher Ouari, Anis Kaci, Christophe Giraud-Audine, Frédéric Giraud, Betty Lemaire-Semail
Enhancing the Perceived Pseudo-Torque Sensation based on the Distance between Actuators Elicited by Asymmetric Vibrations

The phenomenon of illusory pulling force, induced by asymmetric vibrations, represents a distinct perceptual experience. This pseudo-force sensation has been explored through various techniques involving the deployment of multiple vibrotactile actuators. Although these actuators can elicit multi-dimensional pseudo-forces, the impact of distance between actuators on the rotational pseudo-force (or torque sensation) remains unexplored. This study examines the influence of vibrotactile actuator distance on the pseudo-torque sensation elicited by asymmetric vibrations. We hypothesized that the distance between actuators impacts the perceived pseudo-torque in line with the principle of leverage. Our findings indicate an increase in perceived pseudo-torque up to a 50 mm separation between actuators, with an optimal distance correlating to 0.43 times the hand size for effective pseudo-torque generation. These insights are crucial for designing haptic devices capable of imparting pseudo-torque sensations.

Tomosuke Maeda, Takayoshi Yoshimura, Hiroyuki Sakai, Kouta Minamizawa
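
One simple reading of the leverage hypothesis stated above, with assumed symbols (F for the per-actuator pseudo-force, d for the actuator separation, L for hand size):

```latex
% Two actuators applying opposed pseudo-forces F at a separation d produce a
% pseudo-torque about the grip centre that scales with d (leverage principle):
\[
  \tau \;\approx\; F \cdot d ,
  \qquad
  d^{*} \;\approx\; 0.43\,L
\]
% d* is the separation reported as most effective; the perceived gain is
% reported to stop increasing above roughly d = 50 mm.
```
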
Presentation of Slip Sensation Using Suction Pressure and Electrotactile Stimulation

The accurate detection of incipient object slippage is essential for grip control. This study aimed to replicate the sensation of losing grip by simulating a ‘partial slip’ phenomenon, in which the outer area of a contact point fluctuates slightly while the center of contact stays still. Two types of stimulation methods were used: electrotactile stimulation, chosen for its superior spatial and temporal resolution, which facilitates the precise replication of the partial slip area; and air-suction stimulation selected for its stable pressure sensation. We conducted an experiment to evaluate the effectiveness of each stimulation method, both independently and in combination, in terms of their ability to realistically convey the sensation of slippage, including the perceptual clarity of the slip occurrence and its direction. Results showed that the electrotactile stimulation was proficient in presenting distinct sensations of slippage and its direction while also providing some realism. Moreover, it was observed that the incorporation of suction notably enhanced the realism of the tactile sensation, particularly when used in conjunction with electrotactile stimulation.

Yan Xue Teo, Taiga Saito, Takayuki Kameoka, Izumi Mizoguchi, Hiroyuki Kajimoto

Applications and Interaction

Frontmatter
Blindfolded Operation as a Method of Haptic Feedback Design for Mobile Machinery

Mobile machinery includes a wide range of machines performing specific operations in off-road environments for agriculture, mining, and construction. Haptic feedback is not widely used in mobile machinery due to several obstacles. One of them is the lack of methods for choosing efficient haptic cues that improve the operation of each type of machine. This paper considers the blindfolded operation of a machine as an approach that facilitates the development of efficient haptic feedback. The method includes the development of a haptic-only human-machine interface that makes blindfolded operation of the machine possible, followed by interface simplification that excludes the signals that become unnecessary in the presence of visual information. Two examples of developing haptic feedback for mobile cranes are presented. Commercially available haptic joysticks were used as the main controls. The haptic cues developed in the study allowed the transmission of 17 signals to an operator through the haptic joysticks, making blindfolded virtual log crane operation possible. For a limited remote operation of a real crane, haptic cues for transmitting 6 signals were developed with the same approach. The efficiency of the developed haptic feedback was tested with a small group of participants in a real-time simulator of a log crane and in the remote operation of a real crane in a laboratory environment. The participants reported improved depth perception and lower stress levels in remote operation with haptic feedback. The described approach is not limited to mobile cranes and can be used in other types of mobile machinery.

Victor Zhidchenko, Egor Startcev, Heikki Handroos
Effects of Rendering Discrete Force Feedback on the Wrist During Virtual Exploration

Relocating the haptic feedback from the fingertip to the wrist is a trendy topic in haptic-assisted virtual interactions, and finding its best practices still requires a lot of research. In this paper, we investigate the perceptual and performance differences while rendering haptic feedback on the wrist in single-bump, discrete force feedback (through custom voice coil actuation of CoWrHap) or continuous force feedback (through linear DC actuation of LAWrHap). We conducted a user study experiment where participants interacted with identical-looking virtual objects with different stiffness properties and identified the ones with a higher stiffness level based on the haptic feedback they received. Our results indicate that participants performed the tasks (i) with higher sensitivity (higher JND), with more confidence (Number of Taps), and with better user experience using LAWrHap compared to using CoWrHap, and (ii) with no difference in terms of task accuracy (PSE), exploration and interaction time between using LAWrHap and CoWrHap.

Samet Mert Ercan, Ayoade Adeyemi, Mine Sarac
Viscous Damping Displayed by Surface Haptics Improves Touchscreen Interactions

Virtual targets on touchscreens (e.g., icons, slide bars, etc.) are notoriously challenging to reach without vision. The performance of the interaction can fortunately be improved by surface haptics, using friction modulation. However, most methods use position-dependent rendering, which forces users to be aware of the target choice. Instead, we propose using tactile feedback dependent on users’ speed, providing a viscous feeling. In this study, we compared three viscous damping conditions, positive damping, negative damping, and variable damping (viscosity was high during slow movements and low during fast movements), against a baseline condition with no tactile feedback. These viscous fields are created by changing net lateral forces based on velocity. Results indicate that, during the initial phase of movement when the finger approaches the target, the various viscous feedback conditions have an insignificant impact on targeting trajectories and movement velocity. However, positive damping and variable damping significantly influence behavior during the selection phase by reducing oscillation around the target and completion time. Questionnaire responses suggest user preference for viscous conditions and disapproval of negative viscous forces. This study provides insights into the role of viscous resistance in touchscreen interactions.

Zhaochong Cai, Michaël Wiertlewski
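
To make the velocity-dependent rendering concrete, here is a minimal sketch of the three damping conditions described above; the force law, coefficients, and speed threshold are illustrative assumptions, not the values used in the study.

```python
import numpy as np

def viscous_lateral_force(velocity_mps, mode="positive", b=0.5, v_threshold=0.05):
    """Velocity-dependent lateral force for a surface-haptic display.
      positive : resists motion (force opposes velocity)
      negative : assists motion
      variable : strong damping when slow, weak damping when fast
    Any other mode returns zero force (baseline, no tactile feedback)."""
    v = np.asarray(velocity_mps, dtype=float)
    if mode == "positive":
        return -b * v
    if mode == "negative":
        return +b * v
    if mode == "variable":
        gain = np.where(np.abs(v) < v_threshold, b, 0.2 * b)
        return -gain * v
    return np.zeros_like(v)

# Example: finger decelerating toward a target.
v = np.linspace(0.3, 0.0, 5)
print(viscous_lateral_force(v, mode="variable"))
```
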
Latency Compensation in Ultrasound Tactile Presentation by Linear Prediction of Hand Posture

Interaction systems using ultrasound haptics technology present tactile stimuli in fixed coordinates in space; hence, system delays cause not only temporal differences in the stimuli but also spatial shifts in presentation points when targets are moving. In particular, if the transducer array is placed surrounding the hand workspace, a shift in ultrasound focus could result in providing strong tactile stimuli to unintended parts of the hand, such as the opposite side of the fingers. In this study, we examine the feasibility of mitigating the delay effect by predicting the surface shape of the hand. The verification system fits the hand surface shape acquired by a depth camera with a hand model represented by low-dimensional posture parameters, and then performs Kalman prediction on the parameter transitions. The results of the user study show that for finger contacts under constant velocity motion conditions, the prediction method can mitigate the decrease in perceived intensity due to ultrasonic focus shift and increase in perceived intensity at undesirable areas.

Atsushi Matsubayashi, Yasutoshi Makino, Hiroyuki Shinoda
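
The latency compensation above relies on Kalman prediction of low-dimensional posture parameters. A minimal constant-velocity prediction step is sketched below; the state layout, noise model, and 50 ms latency are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def kalman_predict(x, P, dt, q=1e-3):
    """One prediction step of a constant-velocity Kalman filter on a single
    hand-posture parameter, used to look ahead by the system latency dt.
    State x = [value, rate]; P is its 2x2 covariance; q scales process noise."""
    F = np.array([[1.0, dt],
                  [0.0, 1.0]])                     # constant-velocity transition
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])            # white-acceleration process noise
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

# Example: compensate a hypothetical 50 ms system latency.
x = np.array([0.2, 1.5])        # current parameter value and its rate of change
P = np.eye(2) * 1e-2
x_ahead, _ = kalman_predict(x, P, dt=0.05)
```
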
Pseudo-Frequency Modulation: A New Rendering Technique for Virtual Textures

Creating virtual textures with speed-invariant identities requires spatial constancy, attained by changing temporal frequency with a finger’s sliding speed. Implementing sensations of continuous frequency change, however, is non-trivial for low-cost wearable vibrotactile displays, as the amplitude of commonplace voice coil motors and linear resonant actuators varies strongly with driving frequency. In this paper, we present Pseudo-Frequency Modulation (PFM), a technique to continuously change the perceived frequency by modulating only the amplitudes of two single-frequency components in response to changes in sliding speed. Results of a psychophysical study with a wrist-worn vibrotactile display show that people perceive smooth changes in frequency with PFM, and this technique can be reliably used to display distinguishable textures which differ in terms of perceived surface properties such as coarseness.

Paras Kumar, Rebecca F. Friesen
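
A minimal sketch of the pseudo-frequency-modulation idea described above: only two fixed-frequency components are driven, and their relative amplitudes are shifted to steer the perceived frequency. The two component frequencies and the linear amplitude blend are assumptions, not the paper's exact mapping.

```python
import numpy as np

def pfm_signal(t, perceived_hz, f_low=80.0, f_high=250.0, amp=1.0):
    """Drive two fixed-frequency components and steer the *perceived*
    frequency between them by shifting their relative amplitudes."""
    perceived_hz = np.clip(perceived_hz, f_low, f_high)
    w = (perceived_hz - f_low) / (f_high - f_low)   # 0 -> all low, 1 -> all high
    return amp * ((1 - w) * np.sin(2 * np.pi * f_low * t)
                  + w * np.sin(2 * np.pi * f_high * t))

# Example: perceived frequency tied to sliding speed over a virtual texture
# (0.5 m/s over a hypothetical 300 cycles/m texture -> 150 Hz).
t = np.linspace(0, 0.1, 1000)
signal = pfm_signal(t, perceived_hz=0.5 * 300)
```
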
Do Vibrotactile Patterns on both Hands Improve Guided Navigation with a Walker?

Current assistive technologies for navigation guidance using non-visual feedback often rely on audiovisual cues only. Haptic cues can also provide rich information while leaving the auditory sensory channel unencumbered. By enhancing the sensory information available for guidance, haptics has the potential to improve the adoption of guiding technologies by people with, e.g., visual impairments. This paper evaluates the capacity of tactile patterns delivered by haptic handles to guide users walking with a walker while they are asked to follow a predefined path. We conducted a user study with 18 participants who used two haptic handles mounted on a walker in actual walking conditions. We implemented three types of vibrotactile patterns for guidance, used depending on the direction: uni-manual (one handle), bi-manual (two handles), and dual (combining one-handle and two-handle patterns). We also compared vibration and tapping stimulation modes to test for their potential influence on the guiding strategy and the user’s preference. Results showed no significant effect of the strategy or the stimulation mode on the accuracy of following the target path. However, they showed that bi-manual conditions presented a higher satisfaction rate, felt mentally less demanding to users, improved the confidence rate in succeeding at the task, and increased the navigation speed.

Inès Lacôte, Pierre-Antoine Cabaret, Claudio Pacchierotti, Marie Babel, David Gueorguiev, Maud Marchal
Data-Driven Haptic Modeling of Inhomogeneous Viscoelastic Deformable Objects

This work provides a new approach for modeling inhomogeneous viscoelastic deformable objects. The approach is validated on a dataset collected at multiple locations of three different inhomogeneous deformable objects. The dataset consists of position and force measurements corresponding to single-finger normal interaction. The approach first employs the principles of feature-based learning and perceptual adaptive sampling mechanisms to reduce the dataset. Then, a single random forest-fractional derivative (RF-FD) based data-driven model is trained on the reduced dataset to estimate a non-parametric relation between position and force samples for each deformable object. Thus, the proposed approach requires just one trained model to predict interactions at unknown locations of the object with good accuracy, unlike the existing clustering-based solution in the literature where one model is trained for each cluster. Our results demonstrate that the proposed approach provides better prediction accuracy in estimating the responses of inhomogeneous objects than the existing solution in the literature, in terms of the relative root mean square error (less than 0.15) and maximum error (less than 0.75 N).

Gautam Kumar, Shashi Prakash, Hojun Cha, Amit Bhardwaj, Seungmoon Choi
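
The random-forest part of the RF-FD model can be illustrated with a small regression sketch from position-type features to force; the fractional-derivative component and the feature-based data reduction are omitted, and all data below are synthetic.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical stand-in for the recorded dataset: indentation depth,
# indentation velocity and probe location (x, y) as inputs, normal force as output.
depth = rng.uniform(0, 0.01, 2000)            # m
velocity = rng.normal(0, 0.02, 2000)          # m/s
xy = rng.uniform(0, 0.1, (2000, 2))           # m
force = (800 * depth + 5 * velocity + 20 * depth * xy[:, 0]
         + rng.normal(0, 0.02, 2000))         # synthetic "inhomogeneous" response (N)

X = np.column_stack([depth, velocity, xy])
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, force)

# Predict the force response at an unseen location of the object.
query = np.array([[0.004, 0.01, 0.07, 0.03]])
print(f"predicted force: {rf.predict(query)[0]:.3f} N")
```
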
Perception of Paired Vibrotactile Stimulus on the Upper Limb: Implications for the Design of Wearable Technology

The ability to correctly perceive multiple stimuli represents an important barrier to the use of vibrotactile devices in training complex behaviors. The aim of this study was to evaluate how stimulation parameters influence perception of vibrotactile patterns applied to the forearm and upper arm. In this experimental protocol, participants (N = 16) were asked to compare two vibrotactile sequences and indicate whether they were the same or different. We examined the effects of (1) sequential versus simultaneous vibrotactile stimulation, (2) the temporal structure of the vibrotactile stimulus on perception, and (3) differences in pattern recognition between the two arm segments. Our results confirmed that perception was generally superior when the two stimuli were presented sequentially and when there was a marked difference in the temporal structure of the signals. At the same time, participants were highly capable of detecting two identical sequences when presented simultaneously. These findings may have important implications for the design of wearable vibrotactile devices intended for guiding upper limb movement.

Dorine Arcangeli, Gabriel Arnold, Agnès Roby-Brami, Giovanni de Marco, Nathanaël Jarrassé, Ross Parry
MoveTouch: Robotic Motion Capturing System with Wearable Tactile Display to Achieve Safe HRI

The collaborative robot market is flourishing as there is a trend towards simplification, modularity, and increased flexibility on the production line. But when humans and robots collaborate in a shared environment, the safety of humans should be a priority. We introduce a novel wearable robotic system to enhance safety during Human-Robot Interaction (HRI). The proposed wearable robot is designed to hold a fiducial marker and maintain its visibility to a motion capture system, which, in turn, localizes the user’s hand with good accuracy and low latency and provides vibrotactile feedback to the user’s wrist. The vibrotactile feedback guides the user’s hand movement during collaborative tasks in order to increase safety and enhance collaboration efficiency. A user study was conducted to assess the recognition and discriminability of ten designed vibration patterns applied to the dorsal (upper) and volar (lower) sides of the user’s wrist. The results show that the pattern recognition rate on the volar side was higher, with an average of 75.64% among all users. Four patterns with a high recognition rate were chosen to be incorporated into our system. A second experiment was carried out to evaluate users’ responses to the chosen patterns in real-world collaborative tasks. Results show that all participants responded to the patterns correctly, and the average response time for the patterns was between 0.24 and 2.41 s.

Ali Alabbas, Miguel Altamirano Cabrera, Mohamed Sayed, Oussama Alyounes, Qian Liu, Dzmitry Tsetserukou
Evaluation of HaptiComm-S for Replicating Tactile ASL Numbers: A Comparative Analysis of Direct and Mediated Modalities

This research investigates the efficacy of HaptiComm-S, a haptic communication device designed to facilitate tactile communication for Deafblind individuals. The primary focus is on evaluating the device’s capability to replicate the tactile American Sign Language (ASL) numbers 0 to 10. Participants performed under two distinct conditions: direct ASL signing and mediated ASL signing through two modalities (Tap and Tap-and-Hold). Our findings demonstrate significant differences in performance between the Direct and Mediated ASL modes. Direct ASL consistently exhibited higher accuracy compared to mediated conditions. Mediated ASL conditions were prone to perceptual errors in number identification. Notably, specific numbers, such as 4, 7, 8, and 9, posed challenges in the mediated conditions, often resulting in confusion among participants. These findings contribute valuable insights for the ongoing refinement in the design of haptic communication devices tailored to the needs of the Deafblind community.

Mounia Ziat, Nurlan Kabdsyhev, Sven Topp, Basil Duvernoy, Jeraldine Milroy, Zhanat Kappassov
Memorable Vibration Pattern Design Based on Writing Pattern

In this paper, we present memorable vibration patterns representing the digits 0 to 9, designed based on writing patterns. Based on handwriting patterns of the 10 digits collected from 50 participants, we designed two different types of vibration patterns to render the digits: vibrotactile flows and discrete vibrotactile simulations. In the user study, we evaluated the identifiability and learnability of the patterns we generated. Participants successfully identified 69.4% of vibrotactile flow patterns and 77.5% of discrete vibrotactile simulations in their first session of 30 trials without training. After two sessions (60 trials), the average recognition rate in the last 30-trial session increased to 83.6% for vibrotactile flows and 91.1% for discrete vibrotactile simulations, showing the ease of learning the vibration patterns. We also observed a lasting learning effect for both types of vibrotactile patterns in a delayed recall test conducted 72–96 h after the first user study: a 90.0% success rate for vibrotactile flows and 91.4% for discrete vibrations.

Zi Ying Wong, Yongjae Yoo, Sang-Youn Kim
Estimating Contact Force Rate Using Skin Deformation Cues

Knowledge of contact force is important in various fields as it provides essential insights into the interactions between objects. Understanding the forces exerted during human-machine interactions is essential for optimizing performance and preventing discomfort or injury. Despite the significance of contact force, there are limited approaches for estimating it accurately when direct measurements are not available. Existing methods often come with constraints such as complexity, cost, and environmental limitations. This paper introduces a method for estimating contact force from skin deformation cues in bare-finger interactions using linear regression models. The statistical results from the linear models align with scientific studies, revealing a significant correlation between force and skin deformation that has a dependence on stimulus moduli. The developed models exhibit reliable estimations of force, with robustness to variances between trials and insensitivity to individual differences in skin properties. This approach offers an alternative method for estimating contact force and indicates the potential of models for estimating key variables associated with user experience.

Bingxu Li, Gregory J. Gerling, Tyler Cody
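
A minimal sketch of force estimation by linear regression on skin-deformation cues, in the spirit of the abstract above; the chosen cues, units, and synthetic data are illustrative assumptions, not the study's measurements.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)

# Hypothetical skin-deformation cues extracted from fingerpad observations:
# contact-area growth, centroid displacement, and the stimulus modulus.
contact_area = rng.uniform(20, 120, 500)      # mm^2
centroid_shift = rng.uniform(0, 2.0, 500)     # mm
modulus = rng.choice([0.2, 1.0, 5.0], 500)    # MPa, stimulus material property
force = (0.02 * contact_area * modulus**0.3 + 0.3 * centroid_shift
         + rng.normal(0, 0.1, 500))           # synthetic contact force (N)

X = np.column_stack([contact_area, centroid_shift, modulus])
model = LinearRegression().fit(X, force)
print("R^2 on the synthetic data:", round(model.score(X, force), 3))
```
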
Move or Be Moved: The Design of a Haptic-Tangible Manipulative for Paired Digital Education Interactives

One of the current limitations in digital educational experiences is the lack of touch. Touch is a critical component in the learning process and in creating inclusive educational experiences for sensorially diverse learners. From haptic devices to tangible user interfaces (TUI), a growing body of research is investigating ways to bring touch back into the digital world, yet many efforts focus on a specific dimension of touch (e.g., haptic feedback or kinesthetic manipulation). Learning, however, is a multi-dimensional touch experience: it is about moving and being moved. This work presents the Action Quad, a novel haptic-TUI design for teaching geometry (specifically quadrilaterals). The Action Quad is a multi-point-of-contact, reconfigurable tool that synergizes the affordances of both kinesthetic interaction and haptic feedback into a single form factor. We present findings from an initial user study (N = 11) investigating how sighted-hearing individuals approach, interact with, and experience the Action Quad, and we present a case study with an individual with blindness. We share key takeaways from the design process and participant feedback on interactions with this novel haptic-TUI device, sharing design insights on an emerging area of research that could support a new class of educational learning tools rooted in touch.

Scott George Lambert, Jennifer L. Tennison, Kyle Mitchell, Emily B. Moore, Jenna L. Gorlewicz
High-Fidelity Haptic Rendering Through Implicit Neural Force Representation

Recent research has demonstrated that neural networks using periodic nonlinearities may be used for implicit representation and reconstruction of continuous-time signals. Starting with a previously published network for representing the Signed Distance Function (SDF) of a mesh surface, we extend the concept and lay the foundation for introducing the additional representation of the Unit Normal Function (UNF). With the representation of these two functions at hand, we construct a penalty-based haptic rendering method. Our experiments suggest that this proposed method is able to handle very large meshes better than other competing alternatives, producing high-fidelity forces, free of discontinuities, by sampling a continuous implicit force function at the desired spatial accuracy.

Christoforos Vlachos, Konstantinos Moustakas
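
The penalty-based rendering described above can be sketched with analytic stand-ins for the learned signed-distance and unit-normal networks; the sphere SDF, stiffness value, and probe position below are assumptions for illustration.

```python
import numpy as np

def sphere_sdf(p, radius=0.05):
    """Analytic signed distance to a sphere (stand-in for the learned SDF)."""
    return np.linalg.norm(p) - radius

def sphere_normal(p):
    """Outward unit normal of the sphere (stand-in for the learned UNF)."""
    n = np.asarray(p, dtype=float)
    return n / np.linalg.norm(n)

def penalty_force(p, sdf, normal, stiffness=500.0):
    """Penalty-based rendering: when the haptic probe penetrates the surface
    (sdf < 0), push it outward along the surface normal, proportionally to the
    penetration depth. Outside the surface, no force is applied."""
    d = sdf(p)
    if d >= 0.0:
        return np.zeros(3)
    return -stiffness * d * normal(p)   # d < 0, so the force points outward

# Example: probe 2 mm inside a 5 cm sphere -> ~1 N restoring force along +z.
p = np.array([0.0, 0.0, 0.048])
print(penalty_force(p, sphere_sdf, sphere_normal))
```
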
“It’s Like Being on Stage”: Staging an Improvisational Haptic-Installed Contemporary Dance Performance

There is increasing research exploring how to augment expressive movements in dance practices by using haptic technologies. Meanwhile, less is known about how the audience perceives such information. In this study, we explore the potential of using a haptic wristband to convey contemporary dancers’ performative somatic information to the audience through real-time control of haptic feedback by a haptic DJ. We then evaluate audience members’ expectations towards the haptic-enabled dance viewing experience in a public performance setting. Participants indicated satisfaction with the improvisational haptic dance viewing experience.

Xuan Li, Ximing Shen, Youichi Kamiyama, Danny Hynds, Arata Horie, Sohei Wakisaka, Kouta Minamizawa
Backmatter
Metadata
Title
Haptics: Understanding Touch; Technology and Systems; Applications and Interaction
Edited by
Hiroyuki Kajimoto
Pedro Lopes
Claudio Pacchierotti
Cagatay Basdogan
Monica Gori
Betty Lemaire-Semail
Maud Marchal
Copyright year
2025
Electronic ISBN
978-3-031-70058-3
Print ISBN
978-3-031-70057-6
DOI
https://doi.org/10.1007/978-3-031-70058-3