
2020 | Book

Robotic Musicianship

Embodied Artificial Creativity and Mechatronic Musical Expression

About this book

This book discusses the principles, methodologies, and challenges of robotic musicianship through an in-depth review of the work conducted at the Georgia Tech Center for Music Technology (GTCMT), where the concept was first developed. Robotic musicianship is a relatively new research field that focuses on the design and development of intelligent music-making machines. The motivation behind the field is to develop robots that not only generate music, but also collaborate with humans by listening and responding in an expressive and creative manner. This combination of human and machine creativity has the potential to surprise and inspire us to play, listen, compose, and think about music in new ways.

The book provides an in-depth view of the robotic platforms designed at the GTCMT Robotic Musicianship Group, including the improvisational robotic percussionists Haile and Shimon, the personal robotic companion Shimi, and a number of wearable robots, such as the Robotic Drumming Prosthesis, The Third Drumming Arm, and the Skywalker Piano Hand. The book discusses numerous research studies based on these platforms in the context of five main principles: Listen like a Human, Play Like a Machine, Be Social, Watch and Learn, and Wear It.

Table of Contents

Frontmatter
Chapter 1. Introduction
Abstract
The purpose of this book is to introduce the concepts that define robotic musicianship and the underlying technology that supports them.
Gil Weinberg, Mason Bretan, Guy Hoffman, Scott Driscoll
Chapter 2. Platforms—Georgia Tech’s Robotic Musicians
Abstract
In Chap. 1 we surveyed a wide range of sound production mechanisms that could allow musical robots to play percussive, stringed, wind, non-traditional and augmented musical instruments.
Gil Weinberg, Mason Bretan, Guy Hoffman, Scott Driscoll
Chapter 3. “Listen Like A Human”—Human-Informed Music Perception Models
Abstract
One of the main guiding principles of our Robotic Musicianship research is to develop robots that can “listen like a human and play like a machine.” We would like our robots to understand music as humans do, so they can connect with their co-players (“listen like a human”), but also surprise and inspire humans with novel musical ideas and capabilities (“play like a machine”).
Gil Weinberg, Mason Bretan, Guy Hoffman, Scott Driscoll
Chapter 4. “Play Like A Machine”—Generative Musical Models for Robots
Abstract
In the previous chapter we discussed a number of approaches that allow Robotic Musicians to “Listen Like a Human” by modeling high-level human musical percepts. In this chapter we focus on the second part of our Robotic Musicianship guideline, “Play Like a Machine”. While implementing “Listen Like a Human” modules in Robotic Musicians allows robots to connect to humans in a relatable manner, “Play Like a Machine” modules are aimed at generating novel musical outcomes that are humanly impossible, in an effort to surprise and inspire human collaborators.
Gil Weinberg, Mason Bretan, Guy Hoffman, Scott Driscoll
Chapter 5. “Be Social”—Embodied Human-Robot Musical Interactions
Abstract
Embodiment has a significant effect on social human-robot interaction, from enabling fluent turn-taking between humans and robots [1] to humans’ positive perception of robotic conversants [2]. In Robotic Musicianship, embodiment and gestural musical interaction can provide social benefits that are not available with standard computer-based interactive music [3, 4].
Gil Weinberg, Mason Bretan, Guy Hoffman, Scott Driscoll
Chapter 6. “Watch and Learn”—Computer Vision for Musical Gesture Analysis
Abstract
In the previous chapter we showed how human musicians can benefit from visual and physical cues afforded by robotic musicians. Similarly, robotic musicians can augment their own abilities by analyzing visual cues from humans. Like humans, robotic musicians can use vision to anticipate, coordinate, and synchronize their music playing with human collaborators.
Gil Weinberg, Mason Bretan, Guy Hoffman, Scott Driscoll
Chapter 7. “Wear it”—Wearable Robotic Musicians
Abstract
Recent developments in wearable technology can help people with disabilities regain their lost capabilities, merging their biological body with robotic enhancements. Myoelectric prosthetic hands, for example, allow amputees to perform basic daily-life activities by sensing and analyzing electric activity from their residual limbs, which is then used to actuate a robotic hand. These new developments not only bring back lost functionalities, but can also provide humanly impossible capabilities, turning those who were considered disabled into the super-abled. The next frontier of Robotic Musicianship research at Georgia Tech focuses on the development of wearable robotic limbs that allow not only amputees, but able-bodied people as well, to play music like no human can, with virtuosity and speed that are humanly impossible. This chapter addresses the promises and challenges of the new frontier of wearable robotic musicians, from a Robotic Prosthetic Drumming Arm that contains a drumming stick with a “mind of its own”, to a “Third Arm” that augments able-bodied drummers, to the Skywalker Piano Hand that uses deep learning predictions from ultrasound muscle data to allow amputees to play the piano using dexterous and expressive finger gestures.
Gil Weinberg, Mason Bretan, Guy Hoffman, Scott Driscoll
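The Skywalker Piano Hand described in the Chapter 7 abstract maps ultrasound images of the residual forearm muscles to finger gestures using deep learning. As a rough illustration only, the following minimal PyTorch sketch (not the authors' implementation; the UltrasoundToFingers name, layer sizes, and the 128x128 input resolution are all assumptions) regresses a continuous flexion estimate for each finger from a single ultrasound frame, which a downstream controller could translate into key presses.

# Minimal sketch, assuming single-channel 128x128 ultrasound frames and
# five continuous per-finger flexion targets in [0, 1]. Illustrative only.
import torch
import torch.nn as nn

class UltrasoundToFingers(nn.Module):
    def __init__(self, num_fingers: int = 5):
        super().__init__()
        # Convolutional feature extractor over one ultrasound frame.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        # Regression head: one flexion value per finger, squashed to [0, 1].
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 4 * 4, 64), nn.ReLU(),
            nn.Linear(64, num_fingers), nn.Sigmoid(),
        )

    def forward(self, frame: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(frame))

# Usage example with placeholder data (not real ultrasound):
model = UltrasoundToFingers()
frame = torch.rand(1, 1, 128, 128)   # batch of one 128x128 frame
flexion = model(frame)               # shape: (1, 5)
print(flexion)

In practice such a model would be trained on recorded ultrasound sequences paired with measured or labeled finger positions; the sketch above only shows the shape of the frame-to-gesture mapping the abstract describes.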
Backmatter
Metadata
Title
Robotic Musicianship
written by
Gil Weinberg
Mason Bretan
Guy Hoffman
Scott Driscoll
Copyright Year
2020
Electronic ISBN
978-3-030-38930-7
Print ISBN
978-3-030-38929-1
DOI
https://doi.org/10.1007/978-3-030-38930-7
