
About this Book

This agenda-setting book presents state-of-the-art research in Music and Human-Computer Interaction (also known as ‘Music Interaction’). Music Interaction research is at an exciting and formative stage. Topics discussed include interactive music systems, digital and virtual musical instruments, and theories, methodologies and technologies for Music Interaction. Musical activities covered include composition, performance, improvisation, analysis, live coding, and collaborative music making. Innovative approaches to existing musical activities are explored, as well as tools that make new kinds of musical activity possible. Music and Human-Computer Interaction is stimulating reading for professionals and enthusiasts alike: researchers, musicians, interactive music system designers, music software developers, educators, and those seeking deeper involvement in music interaction. It presents the very latest research, discusses fundamental ideas, and identifies key issues and directions for future work.



Chapter 1. Music Interaction: Understanding Music and Human-Computer Interaction

We introduce, review and analyse recent research in Music and Human-Computer Interaction (HCI), also known as Music Interaction. After a general overview of the discipline, we analyse the themes and issues raised by the other 15 chapters of this book, each of which presents recent research in this field. The bulk of this chapter is organised as an FAQ. Topics include: the scope of research in Music Interaction; the role of HCI in Music Interaction; and conversely, the role of Music Interaction in HCI. High-level themes include embodied cognition, spatial cognition, evolutionary interaction, gesture, formal language, affective interaction, and methodologies from social science. Musical activities covered include performance, composition, analysis, collaborative music making, and human and machine improvisation. Specific issues include: whether Music Interaction should be easy; what can be learned from the experience of being “in the groove”; and what can be learned from the commitment of musical amateurs. Broader issues include: what Music Interaction can offer traditional instruments and musical activities; what relevance it has for domains unconnected with music; and ways in which Music Interaction can enable entirely new musical activities.
Simon Holland, Katie Wilkie, Paul Mulholland, Allan Seago

Chapter 2. Should Music Interaction Be Easy?

A fundamental assumption in the fields of human-computer interaction and usability studies is that interfaces should be designed for ease of use, with a few exceptions such as the trade-off with long-term power. In this chapter it is argued that in music interaction the situation is far more complex, with social, technical, artistic, and psychological reasons why difficulty is in some cases a good thing, and in other cases a necessary evil. Different aspects of static and time-varying difficulty in music interaction are categorised. Some specific areas in which difficulty seems to be inextricably linked to positive aspects of music interaction are described. This is followed by discussion of some areas in which difficulty is undesirable and, perhaps, avoidable. Examples are drawn from music interaction research in general and from other chapters of this book in particular.
James McDermott, Toby Gifford, Anders Bouwer, Mark Wagy

Chapter 3. Amateur Musicians, Long-Term Engagement, and HCI

Musical instruments have a property of long-term engagement: people frequently become so engaged with them that they practice and play them for years, despite receiving no compensation other than enjoyment. We examine this phenomenon by analysing how the intrinsic motives of mastery, autonomy, and purpose are built into the design of musical instruments, because, according to the self-determination theory of motivation, these three motives affect whether an activity is found enjoyable. This analysis resulted in the identification of seven abstract qualities, inherent to the activity of music making or to the design of musical instruments, which contribute to the three intrinsic motives. These seven qualities can be treated as heuristics for the design of human-computer interfaces that foster long-term engagement. These heuristics can be used throughout the design process, from the preliminary stage of idea generation to the evaluation stage of finished prototypes. Interfaces with instrument-like long-term engagement would be useful in many applications, both inside and outside the realm of music: they seem particularly suited for applications based on the attainment of long-term goals, which can be found in fields such as physical fitness, rehabilitation, education, and many others. In this chapter, we discuss an interface prototype we created and its pending evaluation. This interface, a rehabilitative rhythm game, serves as a case study showing how the heuristics might be used during the design process.
Isaac Wallis, Todd Ingalls, Ellen Campana, Catherine Vuong

Chapter 4. Affective Musical Interaction: Influencing Users’ Behaviour and Experiences with Music

In Human-Computer Interaction (HCI), use of the auditory channel normally involves communicating information to users in the form of short, auditory messages. Given the recent trend of HCI research towards incorporating experiential objectives, we propose that the auditory channel could also be exploited for affective intent. In particular, music could be integrated within interactive technologies as a vehicle to influence users’ behaviour and their experiences. This chapter describes some of the research conducted in other fields that already embrace the affective characteristics of music within their context. The limited amount of research exploiting music affectively in an HCI environment is discussed, including a review of our previous work involving Ambient Music Email (AME), an affective musical extension for email clients. By reflecting on how other subjects investigate the affective nature of music, this chapter aims to show that the HCI field is falling behind, and to inspire further work in this area. In fact, there are a wide variety of potential motivations for working with affective musical interaction, with a vast realm of potential research avenues, some of which are proposed here.
Anna Bramwell-Dicks, Helen Petrie, Alistair D. N. Edwards, Christopher Power

Chapter 5. Chasing a Feeling: Experience in Computer Supported Jamming

Improvisational group music-making, informally known as ‘jamming’, has its own cultures and conventions of musical interaction. One characteristic of this interaction is the primacy of the experience over the musical artefact—in some sense the sound created is not as important as the feeling of being ‘in the groove’. As computing devices infiltrate creative, open-ended task domains, what can Human-Computer Interaction (HCI) learn from jamming? How do we design systems where the goal is not an artefact but a felt experience? This chapter examines these issues in light of an experiment involving ‘Viscotheque’, a novel group music-making environment based on the iPhone.
Ben Swift

Chapter 6. The Haptic Bracelets: Learning Multi-Limb Rhythm Skills from Haptic Stimuli While Reading

The Haptic Bracelets are a system designed to help people learn multi-limbed rhythms (which involve multiple simultaneous rhythmic patterns) while they carry out other tasks. The Haptic Bracelets consist of vibrotactiles attached to each wrist and ankle, together with a computer system to control them. In this chapter, we report on an early empirical test of the capabilities of this system, and consider design implications. In the pre-test phase, participants were asked to play a series of multi-limb rhythms on a drum kit, guided by audio recordings. Participants’ performances in this phase provided a base reference for later comparisons. During the following passive learning phase, away from the drum kit, just two rhythms from the set were silently ‘played’ to each subject via vibrotactiles attached to wrists and ankles, while participants carried out a 30-min reading comprehension test. Different pairs of rhythms were chosen for different subjects to control for effects of rhythm complexity. In each case, the two rhythms were looped and alternated every few minutes. In the final phase, subjects were asked to play again at the drum kit the complete set of rhythms from the pre-test, including, of course, the two rhythms to which they had been passively exposed. Pending analysis of quantitative data focusing on accuracy, timing, number of attempts and number of errors, in this chapter we present preliminary findings based on participants’ subjective evaluations. Most participants thought that the technology helped them to understand rhythms and to play rhythms better, and preferred haptic to audio to find out which limb to play when. Most participants indicated that they would prefer using a combination of haptics and audio for learning rhythms to either modality on its own. Replies to open questions were analysed to identify design issues, and implications for design improvements were considered.
Anders Bouwer, Simon Holland, Mat Dalgleish

Chapter 7. Piano Technique as a Case Study in Expressive Gestural Interaction

There is a longstanding disconnect between mechanical models of the piano, in which key velocity is the sole determinant of each note’s sound, and the subjective experience of trained pianists, who take a nuanced, multidimensional approach to physical gestures at the keyboard (commonly known as “touch”). We seek to peel back the abstraction of the key press as a discrete event, developing models of key touch that link qualitative musical intention to quantitative key motion. The interaction between performer and instrument (whether acoustic or electronic) can be considered a special case of human-machine interaction, and one that takes place on far different terms than ordinary human-computer interaction: a player’s physical gestures are often the result of intuitive, subconscious processes. Our proposed models will therefore aid the development of computer interfaces which connect with human users on an intuitive, expressive level, with applications within and beyond the musical domain.
Andrew P. McPherson, Youngmoo E. Kim

Chapter 8. Live Music-Making: A Rich Open Task Requires a Rich Open Interface

In live human-computer music-making, how can interfaces successfully support the openness, reinterpretation and rich signification often important in live (especially improvised) musical performance? We argue that the use of design metaphors can lead to interfaces which constrain interactions and militate against reinterpretation, while consistent, grammatical interfaces empower the user to create and apply their own metaphors in developing their performance. These metaphors can be transitory and disposable, yet do not represent wasted learning since the underlying grammar is retained. We illustrate this move with reflections from live coding practice, from recent visual and two-dimensional programming language interfaces, and from musical voice mapping research. We consider the integration of the symbolic and the continuous in the human-computer interaction. We also describe how our perspective is reflected in approaches to system evaluation.
Dan Stowell, Alex McLean

Chapter 9. A New Interaction Strategy for Musical Timbre Design

Sound creation and editing in hardware and software synthesizers presents usability problems and a challenge for HCI research. Synthesis parameters vary considerably in their degree of usability, and musical timbre itself is a complex and multidimensional attribute of sound. This chapter presents a user-driven search-based interaction style where the user engages directly with sound rather than with a mediating interface layer. Where the parameters of a given sound synthesis method do not readily map to perceptible sonic attributes, the search algorithm offers an alternative means of timbre specification and control. However, it is argued here that the method has wider relevance for interaction design in search domains which are generally well-ordered and understood, but whose parameters do not afford a useful or intuitive means of search.
Allan Seago
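The search-based interaction style described above can be illustrated with a minimal sketch: the user repeatedly picks whichever candidate sound they judge closest to the timbre they have in mind, and the system narrows its search around that choice. The parameter count, candidate pool size, and shrinking step size here are illustrative assumptions, not the chapter's actual algorithm; the rating function stands in for the user's perceptual judgement.

```python
import random

def user_search(rate_candidate, n_params=4, candidates=5, rounds=10, step=0.5):
    """Iteratively refine a synthesis-parameter vector by user choice.

    rate_candidate(params) -> a score; here it stands in for the user
    picking the candidate they judge closest to the sound in their head.
    """
    best = [0.5] * n_params                   # start mid-range in every parameter
    for _ in range(rounds):
        # Propose perturbed candidates around the current best, clamped to [0, 1],
        # and keep the current best in the pool so quality never regresses.
        pool = [
            [min(1.0, max(0.0, p + random.uniform(-step, step))) for p in best]
            for _ in range(candidates)
        ] + [best]
        best = max(pool, key=rate_candidate)  # the "user" keeps the closest sound
        step *= 0.8                           # narrow the search each round
    return best

# Example: a hidden target timbre, scored by negative squared distance.
target = [0.2, 0.9, 0.4, 0.7]
rating = lambda p: -sum((a - b) ** 2 for a, b in zip(p, target))
result = user_search(rating)
```

Because the user engages only with sounds, never with raw parameter values, the same loop works even when the synthesis parameters have no intuitive perceptual meaning.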

Chapter 10. Pulsed Melodic Processing – The Use of Melodies in Affective Computations for Increased Processing Transparency

Pulsed Melodic Processing (PMP) is a computation protocol that utilizes musically-based pulse sets (“melodies”) for processing – capable of representing the arousal and valence of affective states. Affective processing and affective input/output are key tools in artificial intelligence and computing. In designing processing elements (e.g. bits, bytes, floats, etc.), engineers have primarily focused on processing efficiency and power, and only then investigated ways of making them perceivable by the user/engineer. However, Human-Computer Interaction research – and the increasing pervasiveness of computation in our daily lives – supports a complementary approach in which computational efficiency and power are more balanced with understandability to the user/engineer. PMP allows a user to tap into the processing path to hear a sample of what is going on in that affective computation, as well as providing a simpler way to interface with affective input/output systems. This requires the development of new approaches to processing and interfacing PMP-based modules. In this chapter we introduce PMP and examine the approach using three examples: a military robot team simulation with an affective subsystem, a text affective-content estimation system, and a stock market tool.
Alexis Kirke, Eduardo Miranda
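The core idea of rendering an affective state audible can be sketched in a few lines: a (valence, arousal) pair becomes a short melody, with valence selecting mode and arousal setting pulse rate. The note values, scales, and mapping below are illustrative assumptions for the sake of a concrete example, not the authors' actual PMP protocol.

```python
# Hypothetical mapping: positive valence -> major mode, negative -> minor;
# higher arousal -> faster pulses (shorter note durations).
MAJOR = [0, 2, 4, 5, 7, 9, 11]   # major scale degrees in semitones
MINOR = [0, 2, 3, 5, 7, 8, 10]   # natural minor scale degrees

def affective_melody(valence, arousal, length=8, root=60):
    """Map valence in [-1, 1] and arousal in [0, 1] to a pulse set.

    Returns a list of (midi_note, duration_seconds) pairs.
    """
    scale = MAJOR if valence >= 0 else MINOR
    duration = 0.5 - 0.4 * arousal           # 0.5 s down to 0.1 s per pulse
    notes = [root + scale[i % len(scale)] for i in range(length)]
    return [(n, round(duration, 3)) for n in notes]

calm_happy = affective_melody(valence=0.8, arousal=0.2)   # major, slow pulses
tense      = affective_melody(valence=-0.6, arousal=0.9)  # minor, fast pulses
```

A listener tapping into such a stream hears mode and tempo directly, which is the transparency argument made in the chapter: the computation's affective state is perceivable without any visual display.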

Chapter 11. Computer Musicking: HCI, CSCW and Collaborative Digital Musical Interaction

We are interested in the design of software to transform single user devices such as laptop computers into a platform for collaborative musical interaction. Our work draws on existing theories of group musical interaction and studies of collaboration in the workplace. This chapter explores the confluence of these domains, giving particular attention to challenges posed by the auditory nature of music and the open-ended characteristics of musical interaction. Our methodological approach is described and a study is presented which contrasts three interface designs for collaborative musical interaction. Significant results are discussed, showing that the different interface designs influenced the way groups structured their collaboration. We conclude by proposing several design implications for collaborative music software, and outline directions for future work.
Robin Fencott, Nick Bryan-Kinns

Chapter 12. Song Walker Harmony Space: Embodied Interaction Design for Complex Musical Skills

Tonal harmony is widely considered to be the most technical and complex part of music theory. Consequently, harmonic skills can be hard to acquire. Furthermore, experience of the flexible manipulation of harmony in real time generally requires the ability to play an instrument. Even for those with instrumental skills, it can be difficult to gain clear insight into harmonic abstractions. The above state of affairs gives rise to substantial barriers not only for beginners but also for many experienced musicians. To address these problems, Harmony Space is an interactive digital music system designed to give insight into a wide range of musical tasks in tonal harmony, ranging from performance and composition to analysis. Harmony Space employs a principled set of spatial mappings to offer fluid, precise, intuitive control of harmony. These mappings give rise to sensory-motor and music-theoretic affordances that are hard to obtain in any other way. As a result, harmonic abstractions are rendered amenable to concrete, visible control by simple spatial manipulation. In the language of conceptual metaphor theory, many relationships in tonal harmony become accessible to rapid, universal, low-level, robust human inference mechanisms using image schemata such as containment, contact, centre-periphery, and source-path-goal. This process is more rapid, and imposes far less cognitive load, than slow, abstract symbolic reasoning. Using the above principles, several versions of Harmony Space have been designed to exploit specific interaction styles for different purposes. We note some key variants, such as the desktop version and the camera-tracked version, while focusing principally on the most recent version, Song Walker, which employs whole-body interaction. Preliminary results from a study of the Song Walker system are outlined, in which both beginners and expert musicians undertook a range of musical tasks involving the performance, composition and analysis of music.
Finally, we offer a discussion of the limitations of the current system, and outline directions for future work.
Anders Bouwer, Simon Holland, Mat Dalgleish

Chapter 13. Evolutionary and Generative Music Informs Music HCI—And Vice Versa

This chapter suggests a two-way influence between the field of evolutionary and generative music and that of human–computer interaction and usability studies. The interfaces used in evolutionary and generative music can be made more effective and more satisfying to use with the influence of the ideas, methods, and findings of human–computer interaction and usability studies. The musical representations which are a focus of evolutionary and generative music can enable new user-centric tools for mainstream music software. Some successful existing projects are described and some future work is proposed.
James McDermott, Dylan Sherry, Una-May O’Reilly

Chapter 14. Video Analysis for Evaluating Music Interaction: Musical Tabletops

There is little evaluation of musical tabletops for music performance, and current approaches tend to give little consideration to social interaction. However, in collaborative settings, social aspects such as coordination, communication, or musical engagement between collaborators are fundamental for a successful performance. After an overview of the use of video in music interaction research as a convenient method for understanding interaction between people and technology, we present three empirical examples of approaches to video analysis applied to musical tabletops: firstly, an exploratory approach to give informal insight towards understanding collaboration in new situations; secondly, a participatory design approach oriented to improving an interface design by getting feedback from the user experience; thirdly, a quantitative approach, towards understanding collaboration by considering frequencies of interaction events. The aim of this chapter is to provide useful insight into how to evaluate musical tabletops using video as a data source. Furthermore, this overview can shed light on understanding shareable interfaces in a wider HCI context of group creativity and multi-player interaction.
Anna Xambó, Robin Laney, Chris Dobbyn, Sergi Jordà

Chapter 15. Towards a Participatory Approach for Interaction Design Based on Conceptual Metaphor Theory: A Case Study from Music Interaction

“Music Interaction” is the term for interaction design within the domain of music. In areas such as music, the ability to engage effectively in certain activities tends to be restricted to those who have acquired detailed knowledge of domain-specific theories, terminologies, concepts or processes. It can be challenging to design or enhance user interfaces for software able to support novices in these kinds of musical activities. One promising approach to this challenge involves translating musicians’ implicit domain knowledge into patterns known as conceptual metaphors, which are metaphorical extensions of recurring patterns of embodied experience applied to abstract domains, and using this information to inform interaction designs for music. This approach has been applied experimentally with some success to designing user interfaces. However, to the best of our knowledge, this present work is the first to consider in detail the use of Conceptual Metaphor Theory as a key component of a participatory design process. In this chapter we present a participatory approach to Music Interaction design based on the principles of Conceptual Metaphor Theory. We posit that such an approach will facilitate the development of innovative and intuitive interaction designs for both novices and experts alike.
Katie Wilkie, Simon Holland, Paul Mulholland

Chapter 16. Appropriate and Complementary Rhythmic Improvisation in an Interactive Music System

One of the roles that interactive music systems can play is to operate as real-time improvisatory agents in an ensemble. A key issue for such systems is how to generate improvised material that is musically appropriate, and complementary to the rest of the ensemble. This chapter describes some improvisation strategies employed by the Jambot (a recently developed interactive music system) that combine both imitative and ‘intelligent’ techniques. The Jambot uses three approaches to mediate between imitative and intelligent actions: (i) mode switching based on confidence of understanding, (ii) filtering and elaboration of imitative actions, and (iii) measured deviation from imitative action according to a salient parametrisation of the action space. In order to produce appropriate rhythms the Jambot operates from a baseline of transformed imitation, and utilises moments of confident understanding to deviate musically from this baseline. The Jambot’s intelligent improvisation seeks to produce complementary rhythms by manipulating the level of ambiguity present in the improvisation to maintain a balance between novelty and coherence.
Toby Gifford
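The first of the three mediation strategies above, mode switching based on confidence of understanding, can be sketched as a simple decision rule: imitate by default, and deviate from the imitative baseline only when the system's confidence in its rhythmic analysis is high. The threshold and the deviation rule (accenting weak subdivisions) are illustrative assumptions, not the Jambot's actual implementation.

```python
def choose_action(heard_pattern, confidence, threshold=0.75, deviation=0.2):
    """Return (mode, pattern) for one bar of rhythmic response.

    heard_pattern: onset strengths in [0, 1], one per beat subdivision.
    confidence:    the agent's confidence in its analysis, in [0, 1].
    """
    if confidence < threshold:
        # Low confidence: stay close to the ensemble (imitative baseline).
        return "imitate", list(heard_pattern)
    # High confidence: deviate measurably from the baseline, here by
    # nudging weak subdivisions towards an accent to add novelty.
    deviated = [round(min(1.0, s + deviation * (1.0 - s)), 2) for s in heard_pattern]
    return "intelligent", deviated

mode, out = choose_action([1.0, 0.2, 0.6, 0.1], confidence=0.9)
```

The point of the rule is the balance the chapter describes: deviation injects novelty only when the agent understands the ensemble well enough for that novelty to remain coherent.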

