
2020 | OriginalPaper | Chapter

6. “Watch and Learn”—Computer Vision for Musical Gesture Analysis

Authors : Gil Weinberg, Mason Bretan, Guy Hoffman, Scott Driscoll

Published in: Robotic Musicianship

Publisher: Springer International Publishing


Abstract

In the previous chapter we showed how human musicians can benefit from the visual and physical cues afforded by robotic musicians. Similarly, robotic musicians can augment their own abilities by analyzing visual cues provided by humans. Like humans, robotic musicians can use vision to anticipate, coordinate, and synchronize their music playing with human collaborators.


Metadata
Title
“Watch and Learn”—Computer Vision for Musical Gesture Analysis
Authors
Gil Weinberg
Mason Bretan
Guy Hoffman
Scott Driscoll
Copyright Year
2020
DOI
https://doi.org/10.1007/978-3-030-38930-7_6
