Introduction
Despite the positive health effects, there is a high incidence of lower extremity injuries during running [1, 2]. Although estimates suggest that 10–20% of Americans run regularly, with 40–50% of these injured annually [3], causation is more complex: a survey of results across 17 published studies, covering a range of specific population characteristics (age, experience, gender, etc.), showed that annual injury rates can vary from 19 to 79% [4]. Among these injuries, half occur at the knee joint, with patellofemoral pain (PFP) being the most common diagnosis [2, 4]. PFP can lead to severe pain and disability and is a precursor of knee osteoarthritis [5].
Joint moments can be used as an indicator of joint loading and have potential application in sports performance and injury prevention. Peak knee flexion moment and flexion moment impulse are related to the progression of patellofemoral joint (PFJ) osteoarthritis [6]. An increased knee flexion moment suggests greater quadriceps force requirements and has been reported to result in higher PFJ reaction force and stress [7, 8].
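As an illustration of the two metrics named above, the peak flexion moment and the flexion moment impulse can be obtained from a sampled moment-time curve by taking the maximum and the time integral of the flexion portion. The sketch below uses a synthetic half-sine stance profile purely for illustration; the sign convention (flexion positive) and the numbers are assumptions, not data from the cited studies.

```python
import numpy as np

def flexion_moment_metrics(t, moment):
    """Peak knee flexion moment (N*m) and flexion moment impulse (N*m*s)
    from one sampled stance phase. Flexion is taken as the positive
    portion of the moment curve (a sign-convention assumption)."""
    flex = np.clip(np.asarray(moment, dtype=float), 0.0, None)
    peak = float(flex.max())
    # Trapezoidal integration of the flexion portion over time
    dt = np.diff(np.asarray(t, dtype=float))
    impulse = float(np.sum(0.5 * (flex[1:] + flex[:-1]) * dt))
    return peak, impulse

# Synthetic stance phase: 0.25 s half-sine profile peaking at 120 N*m
t = np.linspace(0.0, 0.25, 251)
moment = 120.0 * np.sin(np.pi * t / 0.25)
peak, impulse = flexion_moment_metrics(t, moment)
```

For this half-sine profile the impulse has a closed form, 2AT/π ≈ 19.1 N·m·s, so the trapezoidal estimate can be checked against it.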
Real-time feedback is a promising strategy for motion retraining. Visual and tactile feedback have been implemented to alter knee and impact loading [9–13]. The use of vibrotactile feedback in several medical and non-medical areas is well established [14]. Individualized data-driven models have been used to train novel gaits involving a combination of kinematic modifications [15]. In a comparative study [16], haptic feedback combined with visual feedback yielded better task-learning performance for the lower extremity than visual or haptic feedback alone. A more recent review [17] documented that studies focused on clinical applications of wearable feedback for human gait often used haptic and auditory feedback. Both visual-auditory and visual-tactile feedback provide advantages in reducing reaction times and improving performance [18]. Visual-tactile feedback is more effective when multiple tasks are performed and cognitive workload is high [18]. In a study that evaluated assistive navigation systems for the blind, auditory feedback resulted in a 22-times higher cognitive load than haptic feedback [19]. Previous studies have observed that visual feedback provides a high degree of precision [12]. Vibration provides simple and intuitive feedback, particularly when vision is otherwise occupied [15]. In addition, vibration conveys Cartesian-space directional cues well.
Haptic feedback is increasingly becoming an essential component for maximizing the effectiveness of the interaction between a human user and a machine. Using touch to communicate with users, haptic feedback provides a relative sensation that is important in daily exploration tasks. It can also be a means of delivering cues to a user learning new motor skills [20] or to patients undergoing rehabilitation therapy [21]. As haptic systems are developed as wearable devices, this technology is finding a surge of applications in healthcare, virtual reality, remote assistance, and robotics [22]. Common examples of haptic feedback in everyday life include the vibration alerts of a modern smartwatch and the resistance a car's electric power steering system gives the driver. Many different haptic feedback modalities are used for different tasks and applications. This paper explores these modalities and discusses the use of vibrotactile feedback during locomotion.
During skin stretch, the surface of the haptic device imparts a shear force on the user's skin to excite its mechanoreceptors. By stretching the skin tangentially, skin-stretch feedback can give directional information to the user [23]. A study by Norman et al. demonstrated the effectiveness of a simple fingerpad skin-stretch device for guiding a user's arm via haptic cues and real-time corrective feedback [24]. Motivated by increasing embodiment between amputees and their prosthetic devices, Battaglia et al. evaluated the ability of a rotational skin-stretch haptic wearable to convey proprioceptive information from a robotic hand [25]. For lower-limb amputees, Husman et al. proposed the use of a lateral skin-stretch haptic wearable to cue the user about gait events during ambulation [26].
In electrotactile feedback, electric signals stimulate nerves in the skin via surface electrodes. The main benefits of this modality are that it has no moving parts and can deliver a greater variety of sensations than other forms of feedback [27]. An experiment by Pamungkas et al. describes an electrotactile feedback system that conveys surface properties of a remote object to the back of the user's hand [28]. Using amplitude-modulated electrotactile feedback to the neck, Arakeri et al. developed a system that provides information about the grip force and closure of a hand grasping an object [29].
Vibrotactile feedback is perhaps the most commonly recognized type of haptic feedback, as it is found in mobile phones and gaming-console controllers. Vibrotactile actuators are ideal for many haptic applications due to their low cost, small size, and effectiveness when placed almost anywhere on the body [30]. When combined with motion-capture technology, vibrotactile feedback can be used to help students learn a new motor skill such as playing the violin [20]. More notably, vibrotactile feedback systems are being researched in areas that would improve gait performance for the elderly at risk of falling and for patients with functional disability after stroke. A study by Lee et al. demonstrated the efficacy of vibrotactile cueing for fall prevention, using a split-belt treadmill to simulate unpredictable perturbations [31]. A portable gait-asymmetry rehabilitation system by Azfal et al. delivers vibrotactile cues based on gait-phase measurement to improve gait symmetry in individuals with stroke [21, 32]. Two studies demonstrated that haptic feedback can be used to identify and retrain gait parameters such as toe-in/toe-out configuration and stride length during walking [33, 34]. A separate study has shown positive results among patients who require gait guidance and suffer from gait abnormality due to lack of balance [35].
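The gait-retraining cueing described above can be sketched as a simple dead-band rule: the motors stay silent while the monitored parameter remains within a tolerance band around its target, and a direction-coded cue fires otherwise. The function name, thresholds, and the stride-length example below are hypothetical illustrations, not details of the cited systems.

```python
def vibrotactile_cue(measured, target, tolerance):
    """Dead-band cueing rule: returns None inside the tolerance band,
    otherwise a direction-coded cue telling the wearer which way to
    adjust the monitored gait parameter."""
    error = measured - target
    if abs(error) <= tolerance:
        return None                      # within band: motors stay off
    return "decrease" if error > 0 else "increase"

# Hypothetical example: retraining stride length toward 1.30 m
# with a 0.05 m dead band.
cues = [vibrotactile_cue(x, 1.30, 0.05) for x in (1.28, 1.40, 1.18)]
```

A dead band of this kind avoids continuous stimulation, which matters in practice because users habituate to (and eventually ignore) constant vibration.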
Researchers have proposed using multiple haptic modalities in one device to provide multimodal sensory feedback. Alonzo et al. proposed stacking vibrotactile stimulators on top of electrotactile stimulators to make the system more compact [36]. Another wearable haptic device delivers skin-stretch, pressure, and vibrotactile feedback to convey the status of a teleoperated robot and has been shown to improve user operation performance [37]. Skin stretch is a natural sensing mode for proprioception, making it ideal for intuitively conveying proprioceptive information to the user [25], even compared to vibrotactile feedback [38]. As an alert scheme, vibrotactile feedback was found to be superior to electrotactile feedback in terms of accuracy and user comfort [39]. Vibrotactile feedback systems have also been shown to be an effective, non-invasive method of conveying information or cues that is safer than electrotactile and thermal feedback [40].
Computer simulations with accurate musculoskeletal models can provide detailed insights into the biomechanics of walking [41] and running [42] during treatments. In the biomechanics community, highly accurate human models of the lower and upper extremities, derived from cadaveric specimens, have been used to investigate muscle coordination, identify sources of pathological movement, and establish a scientific basis for treatment planning and design [43, 44]. Several studies have utilized biomechanical modeling and dynamic simulation of the musculoskeletal system to identify the contributors to an individual's gait [45–53]. Metabolic cost models have also been introduced for improving robotic assistance, considering both passive dynamics [54] and fully actuated systems for human walking [55–58]. OpenSim [59] is a widely used biomechanical modeling and analysis application that introduced several innovations to the biomechanics community in joint modeling [60], multi-body and contact modeling, and numerical methods [61]. It provides biologically accurate joint and muscle models that can be used to create anatomically accurate musculoskeletal systems. However, since OpenSim relies on numerical methods to estimate motion dynamics, it cannot simultaneously track and analyze the dynamics of motion in real time.
A plethora of simulation software can be used to model and analyze multi-body systems; some packages are commercial, while others are open source. Current software systems that can be used to build and analyze human and animal models include LifeModeler (commercial) [62], AnyBody (commercial) [63], Visual3D (commercial) [64], SIMM [43], D-Flow (commercial) [65, 66], V-REP (commercial) [67], and OpenSim (open source) [59]. AnyBody, Visual3D, and D-Flow are only capable of inverse dynamics. Other software systems can perform forward dynamics, but they require pre-calculated muscle activations, limiting their use for predicting patient response to medical interventions [68]. OpenSim, AnyBody, LifeModeler, and SIMM lack live dynamic simulation capabilities; they cannot simultaneously track and analyze motion in real time. D-Flow, Visual3D, and V-REP, although capable of live simulations, are not open source. Moreover, simulation development within these software systems is often cumbersome, and interfacing with third-party systems (e.g., VR equipment, IMUs, haptic systems) is not straightforward for the average developer. Developing a simulation framework within a widely supported engine, such as Unity or Unreal Engine, would be much more practical because the resulting products can be designed to be scalable, highly customizable, easy to use, and open source.
Despite all the recent advances in biomechanics, robotics, and computer animation research, there is no established scientific understanding of how real-time multi-modal feedback integrates into locomotion training to improve motor learning and performance [14]. In addition, vibrotactile stimulation as a feedback tool in sports has not been supported by scientific evidence [14]. Finally, there is no unified, portable framework that integrates real-time sensing and feedback with human biomechanical models.
In this paper, we present a pilot study on locomotion training via a wearable haptic feedback system and biomechanical modeling. It provides preliminary results for understanding the effect of real-time vibrotactile feedback in eliciting motor adaptation during locomotion. This work builds upon our recent results on human perception accuracy of vibrotactile feedback during locomotion [69]. In addition, a novel simulation framework for motion tracking and analysis is introduced. This unified framework, implemented within the Unity environment, is used to analyze a subject's baseline and performance characteristics and to provide real-time haptic feedback during locomotion. A notable advantage of building the framework within Unity is access to Unity's extensive Asset Store [70], which contains a plethora of assets that users can integrate with the framework when building custom motion-analysis applications. The framework incorporates accurate musculoskeletal models derived from OpenSim, closed-form calculations of muscle-routing kinematics and kinematic Jacobian matrices, dynamic performance metrics (i.e., muscular effort) [71], human motion reconstruction via IMU sensors, and real-time visualization of the motion and its dynamics.
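The framework's exact muscular-effort metric is not spelled out here; one common formulation in the biomechanics literature, shown below purely as an assumption, is the sum of squared muscle activations, which penalizes heavy reliance on any single muscle.

```python
import numpy as np

def muscular_effort(activations):
    """Sum-of-squared-activations effort metric (a common choice in the
    biomechanics literature; the framework's definition may differ).
    Activations are dimensionless values in [0, 1]."""
    a = np.asarray(activations, dtype=float)
    if np.any(a < 0.0) or np.any(a > 1.0):
        raise ValueError("activations must lie in [0, 1]")
    return float(np.sum(a ** 2))

# Two hypothetical coordination strategies with similar net activation:
shared  = muscular_effort([0.4, 0.4, 0.4])  # load spread across muscles
focused = muscular_effort([1.0, 0.1, 0.1])  # load on one muscle
```

The quadratic form means the "shared" strategy scores lower effort than the "focused" one, which is why such metrics are used to compare coordination patterns rather than just total activation.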
Conclusions
Since the 1970s, the popularity of running as a professional and recreational sport has continuously grown. An estimated 65 million people participated in this activity in the United States alone in 2017 [91]. Between 1990 and 2013, the number of road-race finishers grew from five million to over 19 million [92]. Contributing to its popularity, running has been shown to have major health benefits, such as improving cardiovascular endurance and overall quality of life and decreasing the prevalence of Type 2 diabetes, obesity, and hypertension [93]. In the U.S., 10–20% of the population run regularly, with 40–50% of those injured annually [3]. Among these injuries, half occur at the knee joint, with patellofemoral pain (PFP) being the most common diagnosis [2, 4]. PFP can lead to severe pain and disability and is a precursor of knee osteoarthritis [5]. There is great potential for sports science and physical therapy to use feedback mechanisms as an intervention tool [14]. One advantage of motion feedback is the enhancement of a user's ability to function in a cognitively overloaded situation, such as a multi-task scenario (e.g., running while adapting to postural changes for one or more segments). Our framework is unique in that it integrates portable sensors, models motion dynamics in real time, and provides concurrent feedback to improve running.
A limitation of this study is that motion training was performed on healthy subjects without prior injury. To ensure scientifically sound subject-specific modeling and scaling for our biomechanical model, we did not include patients or the elderly, although the work can be extended to these populations in the future. A history of lower-extremity or low-back surgery may affect running kinematics, kinetics, or muscle activation. Similarly, lower-extremity or low-back pathology may cause pain or discomfort during running. Older patients may also have difficulty remembering gait modifications trained with real-time feedback, particularly if the new gait patterns are a complicated combination of movement alterations. Other future work includes integrating a multi-modal feedback mechanism into the framework, as subjects' perception of feedback modalities may vary with age, gender, fitness level, and injury history.
This pilot study demonstrated the feasibility of providing real-time haptic feedback for motion training using a fully portable, model-based framework. While the proposed framework and pilot study address improving running kinematics and the associated health outcomes, future studies could utilize the framework with modified models for activities of daily living (ADLs) and sport activities. Improved ADL performance would assist elderly populations with fall reduction and disabled communities with impaired sensory systems. With the rapid increase in repetitive sport injuries, using our framework with sport-specific modeling procedures and learning protocols may provide mechanisms to analyze and improve kinematics and kinetics for injury reduction and improved performance.
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.