
Open Access | 2018 | Chapter

11. Haptics for the Development of Fundamental Rhythm Skills, Including Multi-limb Coordination

Authors: Simon Holland, Anders Bouwer, Oliver Hödl

Published in: Musical Haptics

Publisher: Springer International Publishing


Abstract

This chapter considers the use of haptics for learning fundamental rhythm skills, including skills that depend on multi-limb coordination. Different sensory modalities have different strengths and weaknesses for the development of skills related to rhythm. For example, vision has low temporal resolution and performs poorly for tracking rhythms in real time, whereas hearing is highly accurate. However, in the case of multi-limbed rhythms, neither hearing nor sight is particularly well suited to communicating exactly which limb does what and when, or how the limbs coordinate. By contrast, haptics can work especially well in this area, by applying haptic signals independently to each limb. We review relevant theories, including embodied interaction and biological entrainment. We present a range of applications of the Haptic Bracelets, which are computer-controlled wireless vibrotactile devices, one attached to each wrist and ankle. Haptic pulses are used to guide users in playing rhythmic patterns that require multi-limb coordination. One immediate aim of the system is to support the development of practical rhythm skills and multi-limb coordination. A longer-term goal is to aid the development of a wider range of fundamental rhythm skills including recognising, identifying, memorising, retaining, analysing, reproducing, coordinating, modifying and creating rhythms—particularly multi-stream (i.e. polyphonic) rhythmic sequences. Empirical results are presented. We reflect on related work and discuss design issues for using haptics to support rhythm skills. Skills of this kind are essential not just to drummers and percussionists but also to keyboard players and, more generally, to all musicians who need a firm grasp of rhythm.

11.1 Introduction

The role of the sense of touch in musical skills and the use of haptic devices to support musical activities are explored throughout this book. In this chapter, we focus on the use of haptics for learning fundamental rhythm skills, in particular skills typically learned through multi-limb coordination. The motivation for using haptics for this purpose relates to the different strengths and weaknesses of different sensory modalities. Vision is poor at tracking rhythms in real time, due to its lack of fine temporal discrimination, while hearing is considerably more accurate. However, when learning to recognise and play multi-limbed rhythms, neither hearing nor sight is well suited to communicate which limb does what and when, or how the limbs coordinate to form complex patterns. This is an area in which haptics can excel, by applying separate haptic signals to individual limbs. With this goal in mind, we have developed a system called the Haptic Bracelets, and we explore several of its applications in this chapter. The Haptic Bracelets are wearable haptic devices designed to help people learn multiple simultaneous (i.e. polyphonic) rhythmic patterns. Although the bracelets are fundamentally simple in conception, and although they make use of elements common in other haptic systems, in some respects they occupy a little-explored part of the design space. In particular, they require different aspects of human cognition, perception and motor skills to be taken into account when considering some of the opportunities and affordances they present.
In simple terms, the bracelets are wearable haptic devices designed to be worn by an individual (or, for some applications, by pairs of individuals, or groups) on all four limbs (two wrists and two ankles). Each bracelet contains (Fig. 11.1): a high-resolution inertial measurement unit (IMU)1; precise, fast-acting vibrotactiles2 with a wide dynamic range; a processor; and a Wi-Fi module (RN-XV Wi-Fly3). Each set of four bracelets is coordinated by a master processor, typically on a smartphone or laptop. Where more than one user is involved, master processors communicate with one another.
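To make the coordination concrete, the sketch below shows how a master process might dispatch timed pulse commands to the four bracelets over Wi-Fi. It is illustrative only: the chapter does not specify the wire protocol, so the UDP transport, the JSON message format, the IP addresses and the limb names used here are all assumptions.

import json
import socket
import time

# Hypothetical bracelet addresses; in practice these would be configured or discovered.
BRACELETS = {
    "left_wrist": ("192.168.0.11", 9000),
    "right_wrist": ("192.168.0.12", 9000),
    "left_ankle": ("192.168.0.13", 9000),
    "right_ankle": ("192.168.0.14", 9000),
}

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_pulse(limb, intensity=1.0, duration_ms=60):
    """Ask one bracelet to fire a single vibrotactile pulse."""
    msg = json.dumps({"cmd": "pulse", "intensity": intensity, "duration_ms": duration_ms})
    sock.sendto(msg.encode(), BRACELETS[limb])

# Example: a four-beat count-in at 120 BPM, felt on alternating wrists.
if __name__ == "__main__":
    beat = 60.0 / 120.0
    for i in range(4):
        send_pulse("right_wrist" if i % 2 == 0 else "left_wrist")
        time.sleep(beat)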
In terms of basic operation, the bracelets can sense what actions a drummer is making with each limb and when. This can also be directly communicated from one drummer to another, as explored below. The bracelets have a range of musical applications, which we will consider in depth in this chapter, including the following:
  • The Haptic iPod;
  • Drum teaching, with matching sets worn by teacher and learner;
  • Musician coordination and synchronisation;
  • Teaching multi-limb drum patterns by multi-limbed haptic cueing.
The above applications can be valuable not just to drummers, but to any musicians who need a firm grasp on how rhythmic patterns interlock. Arguably, this applies to all musicians, but especially to those who play polyphonic instruments or who have complex rhythmic interactions with other players.
Interestingly, the Haptic Bracelets have also found applications in the digital health domain, particularly in rehabilitation for sufferers from a range of movement-related neurological conditions including stroke, Parkinson’s, Huntington’s and brain trauma [1–4]. However, this is mostly outside the scope of this chapter.
There is a wealth of existing research on the use of haptics for communicating different kinds of information, for example notifications [5], posture improvement [6, 7] and tempo synchronisation among musicians [8, 9], and more generally for conveying information about different categories of phenomena such as forces [10], shapes, textures, moving objects, patterns and sequence ordering, as reviewed in [11]. By contrast, there is rather less research on the use of haptics for communicating precise temporal patterns, especially multiple simultaneous temporal patterns. Work in broadly related parts of the design space is reviewed in Sect. 11.5.
In order to understand how people perceive and deal with rapid temporal patterns, it helps to be aware of theories of biological entrainment and neural resonance theory—both of which are reviewed in the next section.

11.2 Motivation and Theoretical Background

The motivation and theoretical background for the Haptic Bracelets are drawn from a variety of sources, as we explore below. The original motivation for the bracelets came from music education, specifically Émile Jaques-Dalcroze’s Eurhythmics. Theoretical insights came from research in music perception by Bamberger [12], Lerdahl and Jackendoff [13], and others, as well as from work in ethnomusicology by Arom [14]. Once various prototype versions of the bracelets were built [2, 3, 11, 15, 16], research from cognitive science, particularly the theories of human biological entrainment and neural resonance, proved invaluable in understanding key aspects of how humans interact with the bracelets.

11.2.1 Dalcroze Eurhythmics

The Swiss music educator Émile Jaques-Dalcroze (1865–1950) noticed that many of his students seemed to read and play music notation stiffly, as an abstract activity, with little evidence of feeling the rhythms in their bodies [17]. By contrast, when observing musicians in Algeria, he noticed that they seemed to feel music in their whole bodies, engaging more deeply with complex rhythms. Dalcroze devised a wide range of physical musical games, culminating in the educational system known as Dalcroze Eurhythmics,4 still widely influential and in use today [17]. Amongst other things, this involves students listening to music while moving arms and legs independently, to mirror movement in different simultaneous streams in the music.

11.2.2 Metrical Hierarchies and Polyrhythms

Further theoretical insights come from research in music perception and musicology, reflecting longstanding insights by musicians. To musical novices, musical rhythm may seem like “one event after another”. However, as Lerdahl and Jackendoff and other theorists demonstrated, nearly all Western music is governed by metre. Metre may be viewed as a series of hierarchically coordinated and exactly synchronised temporal layers—each typically highly regular—with interesting exceptions [18]. While there are other vital aspects to rhythm, for example figures, duration, dynamics, accents and syncopation, this layered regularity means that many aspects of coordinating rhythm can be effectively offloaded from the cognitive system onto the sensorimotor system by learning to assign a different regular repeating pattern to each limb.5 This can be learned by starting with just two limbs and then adding additional limbs. In some non-Western musical traditions, polyrhythmic organisation is used instead of hierarchical metre. In this case, the temporal layers are not organised hierarchically—however, each layer is still typically highly regular, and periodically all of the layers still reach synchronisation points [14]. Consequently, the same principles about moving load from the cognitive system onto the sensorimotor system are relevant.
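As a rough illustration of this offloading strategy, the sketch below generates one bar of nested metrical layers and assigns each layer to a limb. It is a toy example: the particular layer periods and the limb assignment are hypothetical rather than drawn from the chapter.

from fractions import Fraction

def metrical_layers(bar_beats=4, layers=(4, 2, 1, Fraction(1, 2))):
    """Return onset times (in beats) for one bar of hierarchically nested
    metrical layers: whole bar, half bar, beat and half beat."""
    onsets = {}
    for period in layers:
        t, times = Fraction(0), []
        while t < bar_beats:
            times.append(t)
            t += Fraction(period)
        onsets[f"period_{period}"] = times
    return onsets

# Hypothetical limb assignment: one regular layer per limb, as in the
# offloading strategy described above.
LIMB_FOR_LAYER = {
    "period_4": "left_ankle",     # downbeat only
    "period_2": "right_ankle",    # half notes
    "period_1": "left_wrist",     # quarter notes
    "period_1/2": "right_wrist",  # eighth notes
}

if __name__ == "__main__":
    for layer, times in metrical_layers().items():
        print(LIMB_FOR_LAYER[layer], [float(t) for t in times])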

11.2.3 Cognitive Science: Entrainment and Neural Resonance

In addition to domain-specific theories from music education, music psychology and musicology, various theories from cognitive science help to cast light on the Haptic Bracelets. The most widely applicable of these are the theories of embodied interaction [19], enactive cognition [20] and sensorimotor contingency [21]. Broadly speaking, these theories focus not just on purely mental processes, but on the physical enaction of target skills and on sensorimotor interactions that engage the whole body and give participants multi-sensory feedback on how their actions affect their surroundings. However, there are two theories from cognitive science that have much more specific relevance to learning multiple simultaneous rhythmic patterns, namely the theories of biological entrainment and neural resonance, considered below.
Entrainment is a term, originally from physics, that describes how two or more physically connected rhythmic processes interact with each other in such a way that they adjust towards and eventually “lock in” to a common periodicity or phase. However, the concept has been found to have rich and unexpected applications in perception, neuropsychology and music psychology at a variety of different levels [22–24]. At the interpersonal level, musicians have a strong tendency to entrain with each other when playing. This is more interesting than it might appear on the surface, because when two or more musicians play together—despite being demonstrably in time with each other—it may be the case that they rarely or even never play notes at the same time. In the case of entrained musicians, typically what is happening is that, instead of being entrained to the musical surface, the players are entrained to a beat (part of the metre or polyrhythm) that may often be implied rather than being explicitly sounded.
To sharpen this point, most people, musicians and non-musicians alike, are able to tap along metronomically to a monophonic melody or rhythm. However, at many points where a tap sounds, there may be no surface event in the music. Conversely, there may be many events in the music at which no tap occurs. What is particularly interesting about this, for our purposes, is that the ability to extract a beat from an irregular musical surface appears to be an almost exclusively human ability (with notable exceptions identified below). Theorists have created diverse computational and psychological theories to try to account for this ability and for the musical ubiquity of metre and polyrhythm. The best current explanation comes from neural resonance theory.
Neural resonance is a theory [23, 24] proposing that humans have a specialised neural organ, which consists of a bank of actively powered oscillators with temporal periods covering the range from about 0.2 to 2 s. Many phenomena in music perception can be well explained by the way in which these hypothesised oscillators tend to entrain with sensory input. Mathematical models of this organ, based on known characteristics of neural oscillators, are able to reproduce the results of human tapping experiments well, not just for metrical rhythms but also for polyrhythms [23]. The theory of neural resonance also helps to explain the origins of musical metre: given a simple regular external beat with frequency f, not only will the neural oscillator with frequency f entrain, but so too, to a lesser extent, will those with frequencies 2f, 3f, f/2 and f/3.
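The toy calculation below is not the nonlinear oscillator model itself; it merely illustrates the structural point that grids laid out at related periods (multiples and subdivisions of the beat) also align well with the same regular pulse train, which is why those frequencies are natural candidates for partial entrainment. The graded, lesser-extent behaviour comes from the oscillator dynamics of the actual models [23, 24], which this sketch does not attempt to reproduce.

import numpy as np

def beat_strength(onsets, period, tol=0.03):
    """Toy alignment score: the fraction of grid points, laid out at the
    candidate period, that fall within tol seconds of some onset."""
    onsets = np.asarray(onsets, dtype=float)
    grid = np.arange(0.0, onsets.max() + 1e-9, period)
    hits = [np.min(np.abs(onsets - g)) <= tol for g in grid]
    return float(np.mean(hits))

# A strictly regular pulse at 2 Hz (period 0.5 s) lasting 8 s.
pulse = np.arange(0.0, 8.0, 0.5)

# Candidate periods corresponding to f, 2f, 3f, f/2 and f/3.
for label, period in [("f", 0.5), ("2f", 0.25), ("3f", 0.5 / 3),
                      ("f/2", 1.0), ("f/3", 1.5)]:
    print(f"{label:>4}: {beat_strength(pulse, period):.2f}")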
It was originally thought that beat extraction was unique to humans. Indeed, human neonates can extract beats at birth [24], whereas EEG studies have provided evidence that macaque monkeys are unable to extract beats [25]. However, it was unexpectedly discovered [26] that speech-imitating birds such as the sulphur-crested cockatoo (Cacatua galerita eleonora) have expert beat-extraction abilities. The vocal learning hypothesis [26] suggests that rhythmic entrainment abilities may have developed evolutionarily as a by-product of vocal learning mechanisms.

11.3 Applications of the Haptic Bracelets

In this section, we consider four categories of musical use of the Haptic Bracelets that we have prototyped and explored. There is some overlap, but the categories help to illuminate the design space and involve different software.

11.3.1 The “Haptic iPod”

One of the many uses of the Haptic Bracelets is as part of a portable Haptic Music Player or “Haptic iPod” (Fig. 11.2). For this application, the user listens to music on a smartphone, but with the crucial feature that, in time with the music, they can feel in the appropriate limb (by vibrotactile pulses, as detailed below) which limb the drummer uses to strike a drum and when.
Users may engage with the system in a variety of ways to learn rhythms, for example by silently air drumming in time to the music or, if seated, by tapping with hands and feet on nearby surfaces or by “thigh slapping”, both recommended ways of learning rhythms [27]. It is straightforward for the system to sense virtual or actual impacts and to sonify them with chosen drum sounds, should this be desired.
For those wishing to improve their sense of rhythm or their multi-limbed rhythmic skills, the Haptic iPod has the potential to be a compelling application, for the following reasons.
Drummers who are already expert can play what they feel (or imagine) because they have played and felt similar rhythms many times before. When hearing a rhythm being played by another drummer, they may recognise it as something they can play—often already feeling in the imagination which limb should be playing which part of the multichannel rhythm. They have typically internalised a mental model of what a drummer’s arms and legs can do, by playing and listening over many years to rhythms, watching, hearing and trying to replicate what other drummers play. By contrast, for those with little or no drumming experience, the step between hearing a multichannel rhythm and learning to play it is not automatically coupled with the feel of what each limb does. This may not be a major obstacle when hearing a single-channel rhythm, provided that the tempo is within limits and the complexity of the rhythmic pattern falls within the range of what can be grasped and memorised. However, when rhythms involve multiple channels and require multiple limbs to play in a coordinated manner, the task is much harder. In these circumstances, a lack of experience with how different limb movements can interrelate and with how different limbs are associated with different drum sounds will weaken the ability to transfer from hearing to playing. This is where haptics can offer a distinctive advantage. Coupling multichannel musical rhythms to multichannel haptics allows a person to feel the different channels in different limbs, thereby easing the transition from hearing to playing, via feeling. A similar rationale applies to all of the applications of the Haptic Bracelets considered below.
Crucially, the theory of entrainment plays a key role in this explanation. In particular, there is no suggestion that users will learn rhythms reactively by a process of stimulus-response, as in behavioural theories—reacting to each hit as it occurs. Such a process would not be well suited to temporal synchronisation. Rather, for typical musical materials, the streams for each limb will tend to consist predominantly, though not exclusively, of short repeating patterns or figures. Consequently, after initial listening, users are generally able to entrain to and reproduce the streams (see Sect. 11.4).
For the prototype version of this system, a laptop running a DAW6 was used rather than a smartphone, and the stereo audio track had an associated manually prepared synchronised MIDI track that mirrored the drum part. The MIDI drum tracks were used to drive the vibrotactiles on the bracelets, as described in [29]. In future versions of the system, no manual pre-processing of the audio need be involved: software for automatic drum part extraction could be used, though this would identify drums rather than limbs, a limitation discussed in Sect. 11.5.
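As a sketch of the kind of pre-processing involved, the code below reads a drum MIDI guide track and converts it into per-limb pulse events. It assumes the mido library and a hypothetical General MIDI note-to-limb mapping; as noted above, mapping by drum rather than by sensed limb is an approximation, since the same drum may be struck by different limbs.

import mido

# Hypothetical General MIDI drum-note to limb mapping for a basic rock kit.
NOTE_TO_LIMB = {
    36: "right_ankle",  # kick drum
    38: "left_wrist",   # snare
    42: "right_wrist",  # closed hi-hat
    46: "right_wrist",  # open hi-hat
    44: "left_ankle",   # hi-hat pedal
}

def haptic_events(midi_path):
    """Yield (time_in_seconds, limb) pulse events from a drum guide track."""
    t = 0.0
    for msg in mido.MidiFile(midi_path):  # iteration yields delta times in seconds
        t += msg.time
        if msg.type == "note_on" and msg.velocity > 0 and msg.note in NOTE_TO_LIMB:
            yield t, NOTE_TO_LIMB[msg.note]

# Example:
# for when, limb in haptic_events("drum_guide.mid"):
#     print(f"{when:7.3f} s  pulse {limb}")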

11.3.2 Drum Teaching with Haptic Bracelets

The Haptic Bracelets operate rapidly enough to be used for real-time synchronisation between musicians. This enables a drum teacher (Fig. 11.3, right) and a learner (Fig. 11.3, left) each to wear a set of bracelets, so that the learner can feel in the appropriate limb which limb the teacher uses to strike each drum, effectively in real time [3, 29]. The impacts made by each limb are detected by fast sensors, signals are sent by Wi-Fi, and the system uses fast-acting, precise vibrotactiles. Figure 11.4 shows the control interface that maps tap detection on each limb of the teacher’s devices to the corresponding bracelets of the learner. Consequently, communication delays are generally stable and under 10 ms. Taking into account the speed of sound in air, this means that synchronisation via the bracelets over a network can be as close as is generally achieved by musicians playing at a distance of 3.5 m from each other—which is considered real time for most musical purposes. Depending on the quality of the Wi-Fi router and other system factors, beats can exceptionally be delayed or lost, but because the key working principle is entrainment, occasional small disturbances do not matter greatly.
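As a rough check of this comparison, assuming the speed of sound in air to be approximately 343 m/s at room temperature (an assumption not stated in the chapter), the distance covered by sound during the maximum network delay is

\[ d = c \,\Delta t \approx 343~\mathrm{m/s} \times 0.010~\mathrm{s} \approx 3.4~\mathrm{m}, \]

which is consistent with the figure of roughly 3.5 m quoted above.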
Teaching in this way can be in person, over a distance, live or recorded, and one-to-one or one-to-many. Haptic Recordings can be played back later and slowed down for more detailed study, with limbs muted or isolated as needed.

11.3.3 Musician Coordination and Synchronisation

The mode of operation of the Haptic Bracelets outlined above has more general applications for musician coordination and synchronisation. The bracelets can be used to address the problem that, in complex situations, crucial cues between musicians can be missed in the recording studio or live on stage.
Specific modes of use include silent count-ins, hierarchical or polyrhythmic click tracks, confirmation of correct device operation and inter-musician communication, and coordination generally. The idea of a silent count-in is straightforward and not new; however, in the case of complex metres or complex polyrhythms, the bracelets allow silent hierarchical or polyrhythmic count-ins in which up to four layers of the metre or polyrhythm are enacted simultaneously, each felt in the appropriate limb. Haptic count-ins and section announcements could variously be driven by a metronome or MIDI score on a DAW, by a tapping foot, or by other physical actions of a musician, sounded or silent. In device feedback mode, the correct operation of foot pedals and other controllers can be confirmed by haptic feedback—a sophisticated version of this idea has been explored extensively in [28].

11.3.4 Teaching Multi-limb Drum Patterns by Multi-limbed Haptic Cueing

The application of the Haptic Bracelets that we have explored most extensively is teaching multi-limb drum patterns (such as in Figs. 11.5, 11.6 and 11.7) using audio and haptic recordings, as studied in the next section.

11.4 Experimental Results

In this section, we review a series of experiments carried out to test the applicability of haptics for learning rhythm skills. These experiments used a variety of technological and methodological set-ups: earlier experiments used wired systems [15, 29] that sensed which drums were hit and when, whereas our later systems are fully wireless and sense which limbs move and when [3, 16].

11.4.1 Supporting Learning of Rhythm Skills with the Haptic Drum Kit

Our first haptic guidance system was called the Haptic Drum Kit [15]. Its main aim was to support the learning of rhythm skills and multi-limb coordination while playing drums.
The haptic pulses sent to a particular limb indicate the exact moments at which notes should be played with that limb, on a specified part of the drum kit, i.e. hi-hat, ride cymbal, snare drum or kick drum. Because each rhythm is played repeatedly in a loop, the user can listen to and/or feel the pattern before trying to play along with one or all limbs. In other words, the aim of our design is deliberately not to orchestrate stimulus response but rather to foster entrainment.
The original Haptic Drum Kit system consists of the following: vibrotactiles attached to the wrists and ankles using velcro bands; a computer system that feeds signals to the haptic devices; a stereo audio system; and a MIDI drum kit, which is played by the person while wearing the haptic devices.
The MIDI drum kit is connected to the computer running sequencing and recording software (Logic Pro), which allows playback as well as accurate data collection. In the study, MIDI files encoding drum patterns (known as “guide tracks”) were played back by the sequencer to control the generation of audio output and synchronised haptic output. The vibrotactile output signals were generated by a program written in Max and an Arduino board, which was connected to the actuators by wires.
Presentation was possible in one of three modes: audio only, audio plus haptics, or haptics only. The stereo audio system was used to play back both the sound created by playing the MIDI drum kit and the sound from the guide track, when required. In the study, the participants were also recorded on video from three different angles.
To explore what kinds of rhythmic patterns could be supported best by using haptic guidance, twenty reference rhythms were selected as stimuli, drawn from four broadly representative technical categories: (1) metrical rhythms, i.e. 8 beat and 16 beat; (2) linear rudiments that distribute continuous strokes across two limbs, e.g. the alternation of single and double strokes in the paradiddle (see Fig. 11.5); (3) figural rhythms, involving syncopation, based on the Cuban clave (see Fig. 11.6); and (4) polyrhythms, e.g. 2 versus 3, 3 versus 4 (see Fig. 11.7), 2 versus 5, 4 versus 5. The rhythms included patterns for two, three and four limbs.
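For concreteness, the short sketch below generates the onset times of an m-against-n polyrhythm as two streams, one per limb, of the kind used as stimuli here. The cycle length and the limb assignment in the example are hypothetical.

from fractions import Fraction

def polyrhythm(m, n, cycle_beats=4):
    """Onset times (in beats) for an m-against-n polyrhythm over one cycle:
    one limb plays m evenly spaced strokes, the other plays n, and the two
    streams coincide at the start of each cycle (and, for coprime m and n,
    only there)."""
    limb_a = [Fraction(cycle_beats, m) * i for i in range(m)]
    limb_b = [Fraction(cycle_beats, n) * i for i in range(n)]
    return limb_a, limb_b

# Example: the 3-versus-4 pattern of Fig. 11.7 as two haptic streams.
right_hand, left_hand = polyrhythm(3, 4)
print("right hand:", [float(t) for t in right_hand])  # 0.0, 1.33, 2.67
print("left hand: ", [float(t) for t in left_hand])   # 0.0, 1.0, 2.0, 3.0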
Afterwards, a structured interview was carried out with each participant to explore their views on the Haptic Drum Kit and the three conditions used in the experiment. Of the five participants, four were beginners, while one had five years of experience drumming in rock bands and taking drumming lessons.
Although there were some interesting individual differences (see [15] for details), the results can be generally summarised as follows. All participants expressed an interest in using the Haptic Drum Kit again, and most found the system comfortable to wear. However, all participants found the audio presentation clearer to attend to than the haptic presentation, and all found it easier to play in time with the audio stimuli than with the haptic stimuli. Of the three forms of presentation (audio only, haptic only and audio plus haptic), all preferred audio plus haptic, indicating that the haptics were considered to have added value.
The vibrotactile drivers for this version of the Haptic Drum Kit (version 1) appeared to have three weaknesses for our purposes, according to feedback from the five participants in the study: (1) the haptics were not felt clearly enough, especially on the ankles; (2) the attack of the haptic pulses was somewhat blurred, making it difficult to recognise the precise timing of a note to be played; and (3) there was no relative emphasis of haptic pulses, which made it hard to clearly differentiate the beginning of the looping pattern.

11.4.2 Learning Multi-limb Rhythms with Improved Haptic Drum Kit

To address the weaknesses of the first version of the Haptic Drum Kit, an improved version was developed. This second version of the Haptic Drum Kit employs four C2 tactors7 as the vibrotactile devices. They use linear resonant actuators (LRAs) rather than the more common eccentric rotating mass (ERM) actuators, which allows the tactors to deliver very clear haptic signals with a very low start-up time (around 4 ms). Details on these actuator technologies can be found in Sect. 13.2. The tactors are secured to the limbs using elastic velcro bands. As with the earlier version of the system, a MIDI drum kit is used to play and record the drum sounds.
An experiment was carried out using this system with 16 participants (eleven with varying degrees of drumming experience, five without) to see whether this version was more suitable for our purposes and to explore in more detail the effects of haptic guidance on learning of rhythms, for four different kinds of rhythmic stimuli that all require multi-limb coordination. These stimuli form a subset of the rhythms used in the previous study:
  • Linear rudiments (e.g. paradiddle);
  • Metrical rhythms (8 beat rock rhythms);
  • Figural rhythms, involving syncopation, based on the Cuban clave;
  • Polyrhythms, e.g. 2 versus 3, 3 versus 4, 2 versus 5, 4 versus 5.
After the playing sessions, questionnaires were used to gather participants’ feedback on the different conditions. During subsequent analysis, the participants’ performance was manually scored by an experienced percussionist in terms of accuracy and timing, and times were recorded for the moment at which a particular pattern was first attempted and when it was first played correctly.
The results of this study were very encouraging. They indicated that haptic stimuli can be used as a reasonable alternative to audio stimuli in drumming instruction for the various kinds of rhythms employed, achieving similar results in terms of learning speed, i.e. the time required to learn to play an exercise correctly. For accuracy, there were individual differences, which seemed related to the participants’ previous experience in drumming and playing along with metronomes.
For less experienced drummers, accuracy was highest in the haptic condition and lowest in the audio condition, while for the most experienced drummers there was little difference between conditions. Regarding timing, beginners performed best with audio plus haptics, whereas experts performed best with audio only. The data from the questionnaires showed that haptic guidance for multi-limbed drumming was generally well liked, and given a choice between audio, haptic or both audio and haptic presentation, 14 participants preferred audio plus haptic. Most participants enjoyed using the Haptic Drum Kit, found the tactors comfortable to wear, and all except one said they would like to use the system again.
Comparing the different haptic devices, i.e. the vibrotactiles used in version 1 and the tactors used in version 2, the tactors provided better results in terms of both observable performance and participants’ attitudes.

11.4.3 Passive Learning of Multi-limb Rhythm Skills

To find out whether haptically supported learning of similar multi-limb rhythm skills could also take place while the learner is attending to another task, away from the drums, an experiment was carried out to investigate the possibility of passive learning of rhythms while reading [11]. Fifteen people participated in the experiment (eight men and seven women), aged 15–51. Three were experienced drummers (with approximately 10 years of experience playing the drums), five had a little drumming experience, and seven had no experience with drumming.
The technology used in this study was an early version [29] of the Haptic Bracelets. For practical reasons, the system used for this study was wired and stationary, to ensure the maximum possible reliability of timing data. This version of the Haptic Bracelets employed C2 tactor vibrotactiles attached to each wrist and ankle, using elastic velcro bands. The tactors were driven by multichannel signals from a DAW.
The experimental procedure consisted of a pretest phase, a passive learning phase and a post-test phase, as follows. In the pretest phase, participants were asked to play a series of six rhythms (requiring multi-limb coordination, as in the previous study) on a drum kit, guided simply by audio recordings. These performances provided a base reference for later comparisons. During the following passive learning phase, away from the drum kit, participants were asked to carry out a 30-min reading comprehension test. Participants were asked to focus on getting the best possible scores on the comprehension test.
During the comprehension test, just two of the six rhythms from the set were haptically “played” (without audio) to each subject via the vibrotactiles attached to wrists and ankles. Different pairs of rhythms were chosen for different subjects, so that clear distinctions could be made in the next phase. Within that constraint, in order to present an adequate challenge for each subject, choices were made of more or less rhythmic complexity to reflect different levels of previous playing experience.
In each case, the two rhythms were played repeatedly, alternating every few minutes. In the post-test phase, subjects were asked to play again at the drum kit the complete set of rhythms from the pretest, including the two rhythms to which they had been passively exposed. Finally, a questionnaire was used to gain feedback from the participants about their experiences during the experiment and their attitudes towards the Haptic Bracelet technology.
The results from the participants’ subjective evaluations can be summarised as follows (for detail, and the complete set of responses from which a selection is provided here, see [11]).
Most participants thought that the technology helped them to understand rhythms and to play rhythms better, and most preferred haptic to audio to find out which limb to play when. Most participants indicated that they would prefer using a combination of haptics and audio for learning rhythms to either modality on its own.
Interesting quotes from participants in response to the open question “Are there things that you liked about using the technology in the training session?” included the following, all from different participants:
It helped to differentiate between the limbs, whereas using audio feedback it is often hard to separate limb function.
Clarity of the haptics. ‘seeing’ the repeated foot figure in the son clave.
Being able to flawlessly distinguish between which limb to use. The audio is more confusing.
The question “Are there things that you like about the haptic playback?” resulted in responses such as the following:
It makes the playing of complex patterns easier to understand.
Easier to concentrate on the particular rhythms within a polyrhythm (than audio only).
That you could easily feel which drums you needed to play when and how quickly it went on to the next beat.
The answers from participants to the question “Are there things that you don’t like about the haptic playback?” included the following:
repetition gets irritating ‘under the skin’
The ankle vibrations felt weak on me and I had to concentrate hard to feel them.
Just initially strapping on the legs. [Lack of] portability.
All quotes above are selected from [11].
In other words, there is room for improvement in several respects: the feel of the haptics and the straps, especially after longer use; the inconvenience of the wires; and the lack of individually adjustable strength levels for the haptic signal to each limb. The last two points have already been addressed in more recent versions of the Haptic Bracelets, which are portable, wireless and have individually adjustable levels.

11.5 Related Work and Design Issues

As noted earlier, there is much research on the use of haptics for communicating different kinds of musical information, for example notifications [5], posture improvement [7], tempo synchronisation [8, 9], haptic guidance or augmentation in general [30–32] (see also Chaps. 6, 8, 9, 12, 13 and Sect. 10.3) and the effect of haptic feedback on quality perception and user experience [33, 34] (see also Sect. 5.3.2.2, Chaps. 6 and 7). However, in this section we focus principally on haptics for rhythm skills, particularly, though not exclusively, as regards multiple simultaneous streams of rhythms. We will group broadly representative strands of research in this area as follows:
  • haptic metronomes,
  • haptics applied to multiple parts of the body (or the whole body),
  • haptics for non-metronomic temporal sequencing.
Having reviewed the approaches used in this work, we then compare and contrast them with modes of use of the Haptic Bracelets (as considered in Sect. 11.3). The resultant contrasts help to illuminate various design dimensions for haptics for developing rhythm skills.
One straightforward use of haptics in developing rhythm skills is as a haptic metronome. Recently, commercial haptic metronomes have come on the market.8 Giordano and Wanderley [9] demonstrated formally that musicians can reliably follow a tempo set by a haptic metronome. This research showed that deviation from the target inter-onset interval was comparable between the auditory and tactile modalities.
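A deviation measure of this general kind is simple to compute. The sketch below is an illustration of the idea only, not the exact analysis used in [9]; the tap times in the example are invented.

import numpy as np

def ioi_deviation(tap_times, target_ioi):
    """Mean absolute deviation of inter-onset intervals (IOIs) from a target IOI."""
    iois = np.diff(np.sort(np.asarray(tap_times, dtype=float)))
    return float(np.mean(np.abs(iois - target_ioi)))

# Example: taps roughly following a metronome with a 0.5 s period.
print(ioi_deviation([0.00, 0.51, 0.99, 1.50, 2.02], 0.5))  # about 0.015 s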
Several projects have applied haptics to multiple areas of the body for music-related purposes, sometimes via specialised haptic garments [35] (see also Sect. 10.3) and even via furniture [36]. However, the emphasis in these projects is generally not on multi-stream rhythm skills. In many cases, the focus is on exploring novel aesthetic haptic perceptual effects, as in the case of [33, 37]. In some projects of this kind [36], the focus is strongly on Deaf culture,9 and on the use of crossmodal devices and sensory substitution [38] to convey musical information through the sense of touch, particularly for the profoundly deaf. In this context, Fulford [39] has investigated the extent to which tonal intervals can be accurately communicated by touch. Jack et al. [37] have collaborated with Deaf arts activists to produce furniture that translates pitch, rhythm, loudness and timbre to whole-body vibration in psychometrically well-informed ways.
Some work applying haptics to the whole body (or large parts of the body) may have some implications for improving skills related to multi-stream rhythms. An interesting example is a tension-based wearable vibroacoustic device by Yamazaki et al. [40]. This device uses a cord worn around the chest, whose tension is adjusted by DC motors directly driven by an amplified analogue audio signal. This system permits the communication of an acoustic signal with finely detailed bass clarity into the entire chest cavity. Users rated the experience favourably, particularly for music with prominent bass drum parts. Although this system does not spatially separate multiple rhythms, its bass clarity may help wearers in separating low-pitched rhythm parts.
A contrasting system with clear potential relevance to multi-stream rhythm skills is MuSS-bits by Petry et al. [41]. Designed with deaf users in mind, this system uses wireless sensor–display pairs that map audio microphone signals more or less directly to the voltage applied to vibrotactiles, which can be attached anywhere on the body.
One strand of work has focused on haptics for temporal sequencing—particularly for monophonic rhythms and monophonic melodies—though recently the scope has widened [42, 43]. Huang et al. [44, 45] and Siem et al. [46, 47] carried out a series of studies looking at passive learning (i.e. learning without conscious attention) of tasks involving sequential key presses, such as typing or playing piano melodies. A lightweight wireless haptic system was developed for the purpose, with a fingerless glove containing one vibrotactile per finger. This system was used to teach sequences of finger movements to users while they performed other tasks. A sequence of finger movements learned in this way, if subsequently repeated with the five fingers placed over five adjacent keys on a musical keyboard, serves to play a monophonic melody. Target melodies were typically restricted to five pitches, so that no horizontal movement of the hand (as opposed to vertical movement of the fingers) was needed. Sample melodies contained rests and notes of different durations. A study demonstrated that passive learning with audio and haptics combined was significantly more effective than audio only. A more recent study [47] involved passively training both hands simultaneously with material that was monophonic in the right hand but included simple repeating two-note chords in the left hand. This work demonstrated that users may learn tunes for both the left and right hands at once via passive haptic learning. The work by Grindlay [42] focused on passive learning of monophonic drum rhythms, with a mechanical installation providing haptic guidance by automatically moving a single drumstick held by the learner. The results of this study showed that the system supported learning of rhythms that can be played with one hand.
A project that takes involuntary control of a learner’s movements to extremes is the Possessed Hand [48]. This system allows control of a user’s finger movements by applying electrical stimuli to the associated muscles using a belt with 28 electrode pads placed around the forearm. The makers suggest this system could be used in musical applications, in particular for learning correct hand posture when playing the piano or koto, but they note that there are issues to be considered related to reaction rate, accuracy and muscle fatigue. This research is highly unusual in terms of the test subjects’ comments, which include “Scary… just scary” and “I felt like my body was hacked” [48, p. 550].
As noted earlier, we will now compare and contrast the above work with various modes of use of the Haptic Bracelets in order to illuminate various dimensions of the interaction design space for the haptic support of rhythm skills.
One such design dimension contrasts metronomic cueing with interpersonal rhythmic interaction. Commercial haptic metronomes are excellent tools for practising to a beat. Like the Haptic Bracelets, they can allow several musicians to coordinate wirelessly by sharing a common haptic metronomic beat or to be coordinated by cues from a MIDI score on a DAW. However, current commercial haptic metronomes cannot track live limb movement, so cannot, for example, deliver real-time multi-limb polyphonic drumming instruction from a drum teacher, as in the case of the Haptic Bracelets (Sect. 11.3.2). For many purposes, metronomic cueing is sufficient, but live interpersonal entrainment affords additional expressive, musical and educational possibilities.
A second design dimension involves the contrast between discrete versus analog haptic mapping. By analog mapping, we refer to simple mapping of an audio signal—typically amplified and filtered—to a vibrotactile transducer, as opposed to representing rhythmic events by discrete pulses. In the case of [41] and much of the work aimed at whole body experience or Deaf culture, the haptic signals are typically more or less direct mappings of audio signals. By contrast, the Haptic Bracelets and commercial haptic metronomes use discrete haptic signals to represent events in rhythmic patterns. Discrete haptic signals need not be uniform—they can have different intensities, lengths and envelopes, for example to represent accents or textures when driven by a MIDI score. Analog haptics can communicate greater subtlety of texture, and continuous (as opposed to discrete) signals play important roles in deliberately designed haptic perceptual illusions [36]. However, for some purposes discrete pulses can give useful simplicity to the representation of discrete musical events.
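A minimal sketch of the two mapping styles follows, assuming a numeric audio buffer and a list of event onset times; it is not the implementation used by any of the systems cited, but it makes the contrast concrete: the analog mapping follows the audio envelope continuously, whereas the discrete mapping emits a fixed pulse per rhythmic event.

import numpy as np

SR = 1000  # control rate of the vibrotactile drive signal, in Hz (assumed)

def analog_drive(audio, sr=SR, cutoff_hz=20.0):
    """Analog-style mapping: rectify the audio and smooth it with a one-pole
    low-pass filter, so the vibrotactile amplitude follows the audio envelope."""
    audio = np.abs(np.asarray(audio, dtype=float))
    alpha = 1.0 - np.exp(-2.0 * np.pi * cutoff_hz / sr)
    env, out = 0.0, np.empty_like(audio)
    for i, x in enumerate(audio):
        env += alpha * (x - env)
        out[i] = env
    return out

def discrete_drive(onsets_s, length_s, pulse_ms=60, sr=SR):
    """Discrete-style mapping: a fixed-length, fixed-intensity pulse at each event."""
    out = np.zeros(int(length_s * sr))
    width = int(pulse_ms / 1000 * sr)
    for t in onsets_s:
        start = int(t * sr)
        out[start:start + width] = 1.0
    return out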
Choices in the system used for sensing rhythmic events can have interesting design implications when representing polyphonic rhythms, especially when taking cues from a live drummer or teacher. MuSS-bits [41] offers an instructive contrast in this respect with the Haptic Bracelets. MuSS-bits uses analog wireless sensor–display pairs that map microphone signals directly to vibrotactiles. Such a system can readily be used to route different haptic signals onto different limbs, but a simple microphone is less well suited to detecting which limb is striking a drum and when, and better suited to detecting which drum has been struck. This can have advantages in situations where the same limb plays more than one drum, but can have disadvantages where, for example, two limbs alternate in their playing of a single drum (Fig. 11.5).
Yet another design dimension involves the choice of body location(s) when applying haptics. Different locations have different advantages for different applications. For example, as noted earlier, the tension-based system by Yamazaki et al. [40] allows clear communication through the chest of highly detailed bass vibrations, whereas Lewiston [43], Huang et al. [44, 45] and Siem et al. [46, 47] focus on individual fingers, and the Haptic Bracelets focus primarily on the limbs. MuSS-bits, by contrast, emphasises flexibility in the choice of body locations for its wireless sensor–display pairs. The choice of body location for haptics can have a variety of subtle effects on the perception of haptic signals that are beyond the scope of this chapter—a general discussion of this issue can be found in [49].
Finally, there is an important difference between the work by Grindlay [42], Tamaki et al. [48] and our own, related to the dimension of control. Although very different from each other, both of those systems are able to physically control human movements, while in our work (and most other related work) the haptics only communicate signals to guide the user’s movements, and the user remains in control of all physical actions.

11.6 Conclusions

Music is an evolutionarily ancient human activity [50], and rhythm plays a fundamental role in it. Understanding and playing several rhythms simultaneously is one of the most challenging rhythm skills to learn. In this chapter, we have argued that of all the sensory modalities, touch has a special role to play in learning and teaching multi-limbed rhythms. This is because it allows different rhythmic components to be directly experienced simultaneously but separately in the relevant limbs. When experiencing rhythms haptically in this way, users find it relatively easy to mentally direct their attention to the sensations in any single limb or arbitrary combinations of limbs [11]. In many other musical applications of haptics, the user is simply called upon to be reactive, e.g. to respond to notifications, feedback or guidance, or to passively experience aesthetic effects. By contrast, the use of haptics in support of rhythm skills draws on sophisticated predictive skills, in particular the distinctively human capability of biological entrainment.
For the above reasons, we designed and built a series of systems, starting with the Haptic Drum Kit and, more recently, the wireless Haptic Bracelets [3, 16]. We have used these systems to study new ways of learning rhythm skills. They all provide multiple streams of haptic signals to the body, using vibrotactile devices around the wrists and ankles to guide the movement of these limbs in time with repeated rhythmic stimuli. The development of this work was inspired by research from various fields, including music education (e.g. Dalcroze Eurhythmics), musicology, music psychology and cognitive science, in particular the theories of biological entrainment and neural resonance.
In this chapter, we have described several applications of the wireless Haptic Bracelets, including: (1) a portable Haptic Music Player, or “Haptic iPod”, which provides four channels of vibrotactile pulses that track drum parts in time with the music; (2) live interactive drum teaching with Haptic Bracelets worn by both teacher and learner, enabling the learner to feel in the appropriate limbs what the teacher is playing; (3) musician coordination and synchronisation, using the Haptic Bracelets to communicate musical cues such as count-ins, multichannel click tracks or section announcements in situations where audio may not be appropriate, such as recording studios or live on stage—these may be driven by a metronome, DAW or physical actions of a musician; and (4) teaching multi-limb drum patterns by multi-limbed haptic cueing.
Focusing on the last type of application, we have carried out three empirical studies with different versions of the Haptic Drum Kit and Haptic Bracelets to evaluate their usability and usefulness for this purpose. There was evidence that:
  • haptic stimuli can be used to learn to play a variety of multichannel rhythms, generally taking the same amount of time to learn as via audio alone,
  • there was an overwhelming preference for haptics plus audio (compared with audio alone) for learning multi-limb rhythms,
  • most participants preferred haptic to audio to find out which limb to play when,
  • novices in particular benefit from haptics, compared to people with more drumming experience,
  • participants considered that passive haptic playback of rhythms while reading helped them to better understand and play those rhythms.
Compared to related work on using haptics for music education, our approach seems to be unique in the focus on supporting the acquisition of rhythmic skills that involve multi-limb coordination by providing multichannel haptic signals to both wrists and ankles, although the Haptic Bracelet technology is flexible enough to support a range of other applications.
Several areas of further research are suggested by this work, with relevance to various disciplines, including music perception, cognition and production; music education; music and the deaf; human synchronisation; sports science; neuroscience; and digital health. More empirical studies are needed to better understand factors that may affect the learning of multi-limb rhythm skills, including:
  • different locations for placing haptic transducers on the body,
  • different strategies for haptically separating multi-limb drum parts (e.g. by drum versus by limb),
  • different vibrotactile technology,
  • analog versus discrete haptic encodings of rhythms,
  • the optimisation of discrete haptic “timbres” and intensities,
  • conditions promoting active versus passive haptic learning.
More attention is needed to factors such as different levels of drumming experience, the selection of rhythms, and the types of guidance provided (audio, haptic, visual or combinations). Better techniques are needed for automated analyses of drumming performance, characterising timing and accuracy in coordination of the limbs. We need to better understand the interplay between cognitive (e.g. symbolic) and embodied (e.g. Haptic Bracelets) approaches to internalising multiple simultaneous rhythms. Other directions for future work include investigating music-teaching applications that make use of the increased level of interactivity between teachers and learners provided by systems such as the latest version of the Haptic Bracelets. These systems may have particular relevance for deaf musicians. Finally, more work is needed on applications of the Haptic Bracelets in therapeutic settings in the health domain, e.g. combining musical stimuli with haptic guidance to support rehabilitation of walking skills for survivors of stroke and other neurological conditions.
Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made. The images or other third party material in this book are included in the book's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the book's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.
Footnotes
1
Inertial measurement units typically combine accelerometers, gyroscopes and magnetometers.
 
2
In the present chapter, the term “vibrotactile” is often used as a noun to mean “vibrotactile actuator”.
 
3
A now discontinued Wi-Fi solution.
 
4
The band the Eurythmics was named after this educational approach.
 
5
Interestingly, in some special cases, a useful educational strategy can be to shift the memorisation load for multi-stream rhythms in the other direction, for example from limb movement onto language processing, e.g. by using linguistic mnemonics [11].
 
6
Digital Audio Workstation: a software program for recording, editing and producing audio content.
 
7
https://www.eaiinfo.com/tactor-landing/ (last accessed on November 8, 2017).
 
8
For example, the Soundbrenner Pulse, http://www.soundbrenner.com, and the Peterson BodyBeat Pulse, https://www.petersontuners.com/shop/Metronomes/ (last accessed on November 8, 2017).
 
9
Deaf culture (with a capital D) refers to a set of cultural values, behaviours and traditions associated with deafness viewed as a distinctive and valuable human experience, as opposed to a disability.
 
Literature
1.
go back to reference Georgiou, T., Holland, S., van der Linden, J., Tetley, J., Stockley, R., Donaldson, G., Garbutt, L., Pinzone, L., Grassely, F., Deleaye, K.: A blended user-centred design study for wearable haptic gait rehabilitation following hemiparetic stroke. In: Proceedings of the IEEE International Conference on Pervasive Computing Technologies for Healthcare (PervasiveHealth). Istanbul, Turkey (2015) Georgiou, T., Holland, S., van der Linden, J., Tetley, J., Stockley, R., Donaldson, G., Garbutt, L., Pinzone, L., Grassely, F., Deleaye, K.: A blended user-centred design study for wearable haptic gait rehabilitation following hemiparetic stroke. In: Proceedings of the IEEE International Conference on Pervasive Computing Technologies for Healthcare (PervasiveHealth). Istanbul, Turkey (2015)
2.
go back to reference Visi, F., Georgiou, T., Holland, S., Pinzone, O., Donaldson, G., Tetley, J.: Assessing the accuracy of an algorithm for the estimation of spatial gait parameters using inertial measurement units: application to healthy subject and hemiparetic stroke survivor. In: Proceedings of the International Conference on Movement Computing (MOCO), ACM, New York, NY, USA (2017) Visi, F., Georgiou, T., Holland, S., Pinzone, O., Donaldson, G., Tetley, J.: Assessing the accuracy of an algorithm for the estimation of spatial gait parameters using inertial measurement units: application to healthy subject and hemiparetic stroke survivor. In: Proceedings of the International Conference on Movement Computing (MOCO), ACM, New York, NY, USA (2017)
3.
go back to reference Holland, S., Wright, R., Wing, A., Crevoisier, T., Hödl, O., Canelli, M.:. A Gait rehabilitation pilot study using tactile cueing following hemiparetic stroke. In: Proceedings of the IEEE International Conference on Pervasive Computing Technologies for Healthcare (PervasiveHealth), pp. 402–405. Oldenburg, Germany (2014) Holland, S., Wright, R., Wing, A., Crevoisier, T., Hödl, O., Canelli, M.:. A Gait rehabilitation pilot study using tactile cueing following hemiparetic stroke. In: Proceedings of the IEEE International Conference on Pervasive Computing Technologies for Healthcare (PervasiveHealth), pp. 402–405. Oldenburg, Germany (2014)
5.
go back to reference Schumacher, M., Giordano, M., Wanderley, M., Ferguson, S.: Vibrotactile notification for live electronics performance: a prototype system. In: Proceedings of the International Symposium on Computer Music Multidisciplinary Research (CMMR), pp. 516–525. Marseille, France (2013) Schumacher, M., Giordano, M., Wanderley, M., Ferguson, S.: Vibrotactile notification for live electronics performance: a prototype system. In: Proceedings of the International Symposium on Computer Music Multidisciplinary Research (CMMR), pp. 516–525. Marseille, France (2013)
6.
go back to reference van der Linden, J., Johnson, R., Bird, J., Rogers, Y., Schoonderwaldt, E.: Buzzing to play: lessons learned from an in the wild study of real-time vibrotactile feedback. In: Proceedings of the International Conference on Human Factors in Computing Systems (CHI), Vancouver, BC, Canada (2011) van der Linden, J., Johnson, R., Bird, J., Rogers, Y., Schoonderwaldt, E.: Buzzing to play: lessons learned from an in the wild study of real-time vibrotactile feedback. In: Proceedings of the International Conference on Human Factors in Computing Systems (CHI), Vancouver, BC, Canada (2011)
7.
go back to reference Dalgleish, M., Spencer, S.: Postrum: developing good posture in trumpet players through directional haptic feedback. In: Proceedings of the Conference on Interdisciplinary Musicology (CIM), pp. 1–4. Berlin, Germany (2014) Dalgleish, M., Spencer, S.: Postrum: developing good posture in trumpet players through directional haptic feedback. In: Proceedings of the Conference on Interdisciplinary Musicology (CIM), pp. 1–4. Berlin, Germany (2014)
8.
go back to reference Armitage, J., Kia Ng.: mConduct: a multi-sensor interface for the capture and analysis of conducting gesture. In: Electronic Visualisation in Arts and Culture, pp. 153–165 Springer, London (2013) Armitage, J., Kia Ng.: mConduct: a multi-sensor interface for the capture and analysis of conducting gesture. In: Electronic Visualisation in Arts and Culture, pp. 153–165 Springer, London (2013)
9.
go back to reference Giordano, M., Wanderley, M.M.: Follow the tactile metronome: vibrotactile stimulation for tempo synchronisation in music performance. In: Proceedings of the Sound and Music Computing Conference (SMC), Maynooth, Ireland (2015) Giordano, M., Wanderley, M.M.: Follow the tactile metronome: vibrotactile stimulation for tempo synchronisation in music performance. In: Proceedings of the Sound and Music Computing Conference (SMC), Maynooth, Ireland (2015)
10.
go back to reference Sinclair, S.: Force-feedback hand controllers for musical interaction. MSc thesis, Music Technology Area, Schulich School of Music, McGill University, Montreal, Canada (2007) Sinclair, S.: Force-feedback hand controllers for musical interaction. MSc thesis, Music Technology Area, Schulich School of Music, McGill University, Montreal, Canada (2007)
11.
go back to reference Bouwer, A., Holland, S., Dalgleish, M.: The Haptic Bracelets: learning multi-limb rhythm skills from haptic stimuli while reading. In: Holland, S., Wilkie, K., Mulholland, P., Seago, A. (eds.) Music and Human-Computer Interaction, 101–122. Springer, London (2013) Bouwer, A., Holland, S., Dalgleish, M.: The Haptic Bracelets: learning multi-limb rhythm skills from haptic stimuli while reading. In: Holland, S., Wilkie, K., Mulholland, P., Seago, A. (eds.) Music and Human-Computer Interaction, 101–122. Springer, London (2013)
12.
go back to reference Bamberger, J.: The development of musical intelligence I: strategies for representing simple rhythms. In: Logo Memo 19, Epistemology and Learning Group, Massachusetts Institute of Technology, Boston, MA, USA (1975) Bamberger, J.: The development of musical intelligence I: strategies for representing simple rhythms. In: Logo Memo 19, Epistemology and Learning Group, Massachusetts Institute of Technology, Boston, MA, USA (1975)
13.
Lerdahl, F., Jackendoff, R.: A Generative Theory of Tonal Music. MIT Press, Cambridge (1983)
14.
Arom, S.: African Polyphony and Polyrhythm: Musical Structure and Methodology. Cambridge University Press, Cambridge, UK (1991)
15.
Holland, S., Bouwer, A.J., Dalgleish, M., Hurtig, T.M.: Feeling the beat where it counts: fostering multi-limb rhythm skills with the haptic drum kit. In: Proceedings of the International Conference on Tangible, Embedded, and Embodied Interaction (TEI), pp. 21–28. Cambridge, MA, USA (2010)
17.
Juntunen, M.L.: Embodiment in Dalcroze Eurhythmics. Ph.D. thesis, University of Oulu, Finland (2004)
18.
London, J.: Hearing in Time: Psychological Aspects of Musical Meter. Oxford University Press (2004)
19.
Dourish, P.: Where the Action Is: The Foundations of Embodied Interaction. MIT Press, Cambridge (2001)
20.
Thompson, E.: Sensorimotor subjectivity and the enactive approach to experience. Phenomenol. Cogn. Sci. 4(4), 407–427 (2005)
21.
O’Regan, K., Noë, A.: A sensorimotor account of vision and visual consciousness. Behav. Brain Sci. 24(5), 883–917 (2001)
22.
Clayton, M., Sager, R., Will, U.: In time with the music: the concept of entrainment and its significance for ethnomusicology. ESEM CounterPoint 1 (2004)
23.
Angelis, V., Holland, S., Upton, P.J., Clayton, M.: Testing a computational model of rhythm perception using polyrhythmic stimuli. J. New Music Res. 42(1), 47–60 (2013)
24.
Winkler, I., Háden, G.P., Ladinig, O., Sziller, I., Honing, H.: Newborn infants detect the beat in music. Proc. Natl. Acad. Sci. USA 106(7), 2468–2471 (2009)
25.
Honing, H., Merchant, H., Háden, G.P., Prado, L., Bartolo, R.: Rhesus monkeys (Macaca mulatta) detect rhythmic groups in music, but not the beat. PLoS ONE 7(12) (2012)
26.
Patel, A.D., Iversen, J.R., Bregman, M.R., Schulz, I.: Experimental evidence for synchronisation to a musical beat in a nonhuman animal. Curr. Biol. 19(10), 827–830 (2009)
27.
Gutcheon, J.: Improvising Rock Piano. Consolidated Music Publishers, New York, NY, USA (1978)
28.
Michailidis, T.: On the hunt for feedback: vibrotactile feedback in interactive electronic music performances. Ph.D. thesis, Birmingham Conservatoire, UK (2016)
29.
Bouwer, A., Dalgleish, M., Holland, S.: The haptic iPod: passive learning of multi-limb rhythm skills. In: Workshop ‘When Words Fail: What Can Music Interaction Tell Us About HCI?’, British Conference on Human-Computer Interaction, Newcastle, UK (2011)
30.
Altinsoy, E.: The importance of the haptic feedback in musical practice—can a pianist distinguish a Steinway through listening? In: Proceedings of the INTER-NOISE Conference, Institute of Noise Control Engineering, vol. 253(5) (2016)
31.
Knutzen, H., Kvifte, T., Wanderley, M.M.: Vibrotactile feedback for an open air music controller. In: Proceedings of the International Symposium on Computer Music Modeling and Retrieval (CMMR), pp. 41–57 (2013)
32.
O’Modhrain, S.: Playing by feel: incorporating haptic feedback into computer-based musical instruments. Ph.D. thesis, Stanford University, USA (2000)
33.
Giordano, M., Hattwick, I., Franco, I., Egloff, D., Frid, E., Lamontagne, V., TeZ, C., Salter, C., Wanderley, M.: Design and implementation of a whole-body haptic suit for “Ilinx”, a multisensory art installation. In: Proceedings of the Sound and Music Computing Conference (SMC), pp. 169–175. Maynooth, Ireland (2015)
34.
Nanayakkara, S., Taylor, E., Wyse, L., Ong, S.H.: An enhanced musical experience for the deaf: design and evaluation of a music display and a haptic chair. In: Proceedings of the International Conference on Human Factors in Computing Systems (CHI), pp. 337–346. ACM (2009)
35.
Giordano, M.: Vibrotactile feedback and stimulation in music performance. Ph.D. thesis, McGill University, Montréal, Québec, Canada (2016)
36.
Jack, R., McPherson, A., Stockman, T.: Designing tactile musical devices with and for deaf users: a case study. In: Proceedings of the International Conference on the Multimodal Experience of Music, Sheffield, UK (2015)
37.
Hayes, L.: Skin music (2012): an audio-haptic composition for ears and body. In: Proceedings of the ACM SIGCHI Conference on Creativity and Cognition. ACM (2015)
38.
Bird, J., Holland, S., Marshall, P., Rogers, Y., Clark, A.: Feel the force: using tactile technologies to investigate the extended mind. In: Proceedings of the Workshop on Devices that Alter Perception (DAP), pp. 1–4. Seoul, South Korea (2008)
39.
Fulford, R.: Interactive performance for musicians with a hearing impairment. Ph.D. thesis, Manchester Metropolitan University, UK (2013)
40.
Yamazaki, Y., Hironori M., Shoichi H.: Tension-based wearable vibroacoustic device for music appreciation. In: Proceedings of the International Conference on Human Haptic Sensing and Touch Enabled Computer Applications. Springer International Publishing (2016)
41.
Petry, B., Illandara, T., Nanayakkara, S.: MuSS-bits: sensor-display blocks for deaf people to explore musical sounds. In: Proceedings of the Australian Conference on Computer-Human Interaction (OzCHI), pp. 72–80 (2016)
42.
Grindlay, G.: Haptic guidance benefits musical motor learning. In: Proceedings of the Symposium on Haptic Interfaces for Virtual Environments and Teleoperator Systems, pp. 13–14. Reno, NV, USA (2008)
43.
Lewiston, C.: MaGKeyS: a haptic guidance keyboard system for facilitating sensorimotor training and rehabilitation. Ph.D. thesis, MIT Media Laboratory (2008)
44.
Huang, K., Do, E.Y., Starner, T.: PianoTouch: a wearable haptic piano instruction system for passive learning of piano skills. In: Proceedings of the IEEE International Symposium on Wearable Computers (ISWC), pp. 41–44. Pittsburgh, PA, USA (2008)
45.
Huang, K., Starner, T., Do, E., Weinberg, G., Kohlsdorf, D., Ahlrichs, C., Leibrandt, R.: Mobile music touch: mobile tactile stimulation for passive learning. In: Proceedings of the International Conference on Human Factors in Computing Systems (CHI), pp. 791–800. Atlanta, GA, USA (2010)
46.
Seim, C.E., Quigley, D., Starner, T.E.: Passive haptic learning of typing skills facilitated by wearable computers. In: Extended Abstracts on Human Factors in Computing Systems (CHI), pp. 2203–2208. Toronto, Canada (2014)
47.
Seim, C., Estes, T., Starner, T.: Towards passive haptic learning of piano songs. In: Proceedings of the IEEE World Haptics Conference (WHC), pp. 445–450. Evanston, IL, USA (2015)
48.
Tamaki, E., Miyaki, T., Rekimoto, J.: PossessedHand: techniques for controlling human hands using electrical muscles stimuli. In: Proceedings of the International Conference on Human Factors in Computing Systems (CHI), pp. 543–552. Vancouver, BC, Canada (2011)
49.
Karuei, I., MacLean, K.E., Foley-Fisher, Z., MacKenzie, R., Koch, S., El-Zohairy, M.: Detecting vibrations across the body in mobile contexts. In: Proceedings of the International Conference on Human Factors in Computing Systems (CHI), pp. 3267–3276. Vancouver, BC, Canada (2011)
50.
Holland, S., Wilkie, K., Mulholland, P., Seago, A.: Music interaction: understanding music and human-computer interaction. In: Music and Human-Computer Interaction, pp. 1–28. Springer, London, UK (2013)
DOI: https://doi.org/10.1007/978-3-319-58316-7_11