
2020 | OriginalPaper | Chapter

5. “Be Social”—Embodied Human-Robot Musical Interactions

Authors : Gil Weinberg, Mason Bretan, Guy Hoffman, Scott Driscoll

Published in: Robotic Musicianship

Publisher: Springer International Publishing


Abstract

Embodiment has a significant effect on social human-robot interaction, from enabling fluent turn-taking between humans and robots [1] to fostering humans’ positive perception of robotic conversants [2]. In Robotic Musicianship, embodiment and gestural musical interaction can provide social benefits that are not available with standard computer-based interactive music [3, 4].


Footnotes
1
For reference, the motor noise peaked at 51.3 dBA, measured 1.5 m horizontally from the center of the robot's base and at a height of 1.5 m, using a calibrated Apex 435 condenser microphone. Measured under the same conditions with the motors off, the ambient noise in the room was 42.5 dBA.
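
A quick sanity check on these figures: the 51.3 dBA reading taken with the motors running also contains the 42.5 dBA ambient floor, so the motor-only contribution can be estimated by subtracting the two readings on a power basis rather than directly in dB. A minimal illustrative sketch in Python (assuming the two sources are incoherent steady-state levels; the helper name is ours, not from the chapter):

    import math

    def subtract_noise_floor(total_dba: float, floor_dba: float) -> float:
        """Estimate the source-only level by removing the ambient floor on a power basis."""
        # dB levels combine and separate as powers, not as dB values.
        return 10 * math.log10(10 ** (total_dba / 10) - 10 ** (floor_dba / 10))

    # Readings from footnote 1: motors running vs. ambient only.
    print(round(subtract_noise_floor(51.3, 42.5), 1))  # ~50.7 dBA from the motors alone

Under these assumptions the ambient floor inflates the motor reading by only about 0.6 dB, so the motors themselves sit roughly 8 dB above the room's ambient level.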
 
Literature
1.
Kidd, Cory D., and Cynthia Breazeal. 2004. Effect of a robot on user perceptions. In Proceedings of the 2004 IEEE/RSJ international conference on intelligent robots and systems (IROS 2004), vol. 4, 3559–3564. IEEE.
2.
Bainbridge, Wilma A., Justin Hart, Elizabeth S. Kim, and Brian Scassellati. 2008. The effect of presence on human-robot interaction. In The 17th IEEE international symposium on robot and human interactive communication (RO-MAN 2008), 701–706. IEEE.
3.
Weinberg, Gil, and Scott Driscoll. 2006. Toward robotic musicianship. Computer Music Journal 30 (4): 28–45.
4.
Weinberg, Gil, Andrew Beck, and Mark Godfrey. 2009. ZooZBeat: A gesture-based mobile music studio.
5.
Weinberg, Gil. 2005. Interconnected musical networks: Toward a theoretical framework. Computer Music Journal 29 (2): 23–39.
6.
Weinberg, Gil. 1999. Expressive digital musical instruments for children. PhD thesis, Massachusetts Institute of Technology.
7.
Weinberg, Gil, Scott Driscoll, and Travis Thatcher. 2006. Jam’aa: A Middle Eastern percussion ensemble for human and robotic players. In International computer music conference, 464–467.
8.
Luck, Geoff, and John A. Sloboda. 2009. Spatio-temporal cues for visually mediated synchronization. Music Perception: An Interdisciplinary Journal 26 (5): 465–473.
9.
Repp, Bruno H., and Amandine Penel. 2004. Rhythmic movement is attracted more strongly to auditory than to visual rhythms. Psychological Research 68 (4): 252–270.
10.
Povel, Dirk-Jan, and Peter Essens. 1985. Perception of temporal patterns. Music Perception: An Interdisciplinary Journal 2 (4): 411–440.
11.
Müller, Meinard. 2007. Dynamic time warping. In Information retrieval for music and motion, 69–84.
12.
Komatsu, Tomoaki, and Yoshihiro Miyake. 2004. Temporal development of dual timing mechanism in synchronization tapping task. In RO-MAN 2004: 13th IEEE international workshop on robot and human interactive communication, 181–186. IEEE.
13.
Crick, Christopher, Matthew Munz, and Brian Scassellati. 2006. Synchronization in social tasks: Robotic drumming. In RO-MAN 2006: The 15th IEEE international symposium on robot and human interactive communication, 97–102. IEEE.
14.
Inderbitzin, Martin, Aleksander Väljamäe, José Maria Blanco Calvo, Paul F. M. J. Verschure, and Ulysses Bernardet. 2011. Expression of emotional states during locomotion based on canonical parameters. In Ninth IEEE international conference on automatic face and gesture recognition (FG 2011), Santa Barbara, CA, USA, 809–814. IEEE.
15.
Aviezer, Hillel, Yaacov Trope, and Alexander Todorov. 2012. Body cues, not facial expressions, discriminate between intense positive and negative emotions. Science 338 (6111): 1225–1229.
16.
de Gelder, Beatrice. 2006. Towards the neurobiology of emotional body language. Nature Reviews Neuroscience 7: 242–249.
17.
Dael, Nele, Marcello Mortillaro, and Klaus R. Scherer. 2012. The body action and posture coding system (BAP): Development and reliability. Journal of Nonverbal Behavior 36: 97–121.
18.
Coulson, Mark. 2004. Attributing emotion to static body postures: Recognition accuracy, confusions, and viewpoint dependence. Journal of Nonverbal Behavior 28 (2): 117–139.
19.
Krauss, Robert M., Palmer Morrel-Samuels, and Christina Colasante. 1991. Do conversational hand gestures communicate? Journal of Personality and Social Psychology 61 (5): 743.
20.
Kipp, Michael, and J.-C. Martin. 2009. Gesture and emotion: Can basic gestural form features discriminate emotions? In 3rd international conference on affective computing and intelligent interaction and workshops (ACII 2009), 1–8. IEEE.
21.
Picard, Rosalind W. 1995. Affective computing.
22.
Frijda, N.H. 1987. The emotions. London: Cambridge University Press.
23.
Kozima, Hideki, and Hiroyuki Yano. 2001. In search of ontogenetic prerequisites for embodied social intelligence. In Proceedings of the workshop on emergence and development of embodied cognition; international conference on cognitive science, 30–34.
24.
Breazeal, Cynthia, and Lijin Aryananda. 2002. Recognition of affective communicative intent in robot-directed speech. Autonomous Robots 12 (1): 83–104.
25.
Castellano, Ginevra, Iolanda Leite, André Pereira, Carlos Martinho, Ana Paiva, and Peter W. McOwan. 2010. Affect recognition for interactive companions: Challenges and design in real world scenarios. Journal on Multimodal User Interfaces 3 (1): 89–98.
26.
Scheutz, Matthias, Paul Schermerhorn, and James Kramer. 2006. The utility of affect expression in natural language interactions in joint human-robot tasks. In Proceedings of the 1st ACM SIGCHI/SIGART conference on human-robot interaction, 226–233. ACM.
27.
Devillers, Laurence, Laurence Vidrascu, and Lori Lamel. 2005. Challenges in real-life emotion annotation and machine learning based detection. Neural Networks 18 (4): 407–422.
28.
Mehrabian, Albert. 1996. Pleasure-arousal-dominance: A general framework for describing and measuring individual differences in temperament. Current Psychology 14 (4): 261–292.
29.
Russell, James A. 2009. Emotion, core affect, and psychological construction. Cognition and Emotion 23 (7): 1259–1283.
30.
Lindquist, Kristen A., Tor D. Wager, Hedy Kober, Eliza Bliss-Moreau, and Lisa Feldman Barrett. 2012. The brain basis of emotion: A meta-analytic review. Behavioral and Brain Sciences 35 (03): 121–143.
31.
Vytal, Katherine, and Stephan Hamann. 2010. Neuroimaging support for discrete neural correlates of basic emotions: A voxel-based meta-analysis. Journal of Cognitive Neuroscience 22 (12): 2864–2885.
32.
Hamann, Stephan. 2012. Mapping discrete and dimensional emotions onto the brain: Controversies and consensus. Trends in Cognitive Sciences.
33.
Colombetti, Giovanna. 2009. From affect programs to dynamical discrete emotions. Philosophical Psychology 22 (4): 407–425.
34.
Barrett, Lisa Feldman, Maria Gendron, and Yang-Ming Huang. 2009. Do discrete emotions exist? Philosophical Psychology 22 (4): 427–437.
35.
Lasseter, John. 1987. Principles of traditional animation applied to 3D computer animation. Computer Graphics 21 (4): 35–44.
36.
Gielniak, Michael J., and Andrea L. Thomaz. 2011. Anticipation in robot motion.
37.
Cassell, Justine, Tim Bickmore, Lee Campbell, and Hannes Vilhjálmsson. 2000. Designing embodied conversational agents. Embodied Conversational Agents 29.
38.
Nayak, Vishal, and Matthew Turk. 2005. Emotional expression in virtual agents through body language. Advances in Visual Computing 313–320.
39.
Salem, Maha, Stefan Kopp, Ipke Wachsmuth, and Frank Joublin. 2010. Generating robot gesture using a virtual agent framework. In 2010 IEEE/RSJ international conference on intelligent robots and systems (IROS), 3592–3597. IEEE.
40.
Riek, Laurel D., T.-C. Rabinowitch, Paul Bremner, Anthony G. Pipe, Mike Fraser, and Peter Robinson. 2010. Cooperative gestures: Effective signaling for humanoid robots. In 2010 5th ACM/IEEE international conference on human-robot interaction (HRI), 61–68. IEEE.
41.
Moon, A.J., Chris A.C. Parker, Elizabeth A. Croft, and H.F. Van der Loos. 2013. Design and impact of hesitation gestures during human-robot resource conflicts. Journal of Human-Robot Interaction 2 (3): 18–40.
42.
Salem, Maha, Stefan Kopp, Ipke Wachsmuth, Katharina Rohlfing, and Frank Joublin. 2012. Generation and evaluation of communicative robot gesture. International Journal of Social Robotics 4 (2): 201–217.
43.
Breazeal, Cynthia, Andrew Wang, and Rosalind Picard. 2007. Experiments with a robotic computer: Body, affect and cognition interactions. In 2007 2nd ACM/IEEE international conference on human-robot interaction (HRI), 153–160. IEEE.
44.
Hoffman, Guy, and Cynthia Breazeal. 2008. Anticipatory perceptual simulation for human-robot joint practice: Theory and application study. In Proceedings of the 23rd national conference on artificial intelligence, vol. 3 (AAAI’08), 1357–1362. AAAI Press.
45.
Michalowski, Marek P., Selma Sabanovic, and Hideki Kozima. 2007. A dancing robot for rhythmic social interaction. In 2007 2nd ACM/IEEE international conference on human-robot interaction (HRI), 89–96. IEEE.
46.
Monceaux, Jérôme, Joffrey Becker, Céline Boudier, and Alexandre Mazel. 2009. Demonstration: First steps in emotional expression of the humanoid robot Nao. In Proceedings of the 2009 international conference on multimodal interfaces, 235–236. ACM.
47.
Grunberg, David K., Alyssa M. Batula, Erik M. Schmidt, and Youngmoo E. Kim. 2012. Synthetic emotions for humanoids: Perceptual effects of size and number of robot platforms. International Journal of Synthetic Emotions (IJSE) 3 (2): 68–83.
48.
Kidd, Cory David. 2003. Sociable robots: The role of presence and task in human-robot interaction. PhD thesis, Massachusetts Institute of Technology.
49.
Walters, Michael L., Kerstin Dautenhahn, René Te Boekhorst, Kheng Lee Koay, Dag Sverre Syrdal, and Chrystopher L. Nehaniv. 2009. An empirical framework for human-robot proxemics. In Proceedings of new frontiers in human-robot interaction.
50.
Takayama, Leila, and Caroline Pantofaru. 2009. Influences on proxemic behaviors in human-robot interaction. In IEEE/RSJ international conference on intelligent robots and systems (IROS 2009), 5495–5502. IEEE.
51.
Mead, Ross, Amin Atrash, and Maja J. Mataric. 2011. Recognition of spatial dynamics for predicting social interaction. In Proceedings of the 6th international conference on human-robot interaction, 201–202. ACM.
52.
Breazeal, C. 2003. Emotion and sociable humanoid robots. International Journal of Human-Computer Studies 15: 119–155.
53.
Velásquez, Juan D. 1997. Modeling emotions and other motivations in synthetic agents. In Proceedings of the national conference on artificial intelligence, 10–15. Citeseer.
54.
Xia, Guangyu, Roger Dannenberg, Junyun Tay, and Manuela Veloso. 2012. Autonomous robot dancing driven by beats and emotions of music. In Proceedings of the 11th international conference on autonomous agents and multiagent systems, vol. 1 (AAMAS ’12), 205–212. Richland, SC: International Foundation for Autonomous Agents and Multiagent Systems.
55.
Traue, Harald C., Frank Ohl, André Brechmann, Friedhelm Schwenker, Henrik Kessler, Kerstin Limbrecht, Holger Hoffmann, Stefan Scherer, Michael Kotzyba, Andreas Scheck, et al. 2013. A framework for emotions and dispositions in man-companion interaction. Coverbal Synchrony in Human-Machine Interaction, 99.
56.
Frijda, N.H. 1995. Emotions in robots. In Comparative approaches to cognitive science, ed. H.L. Roitblat and J.-A. Meyer, 501–516.
57.
58.
Wallbott, H.G. 1998. Bodily expression of emotion. European Journal of Social Psychology 28 (6): 879–896.
60.
Sullivan, Jean, Linda A. Camras, and George Michel. 1993. Do infants express discrete emotions? Adult judgments of facial, vocal, and body actions. Journal of Nonverbal Behavior 17: 171–186.
62.
Ghias, Asif, Jonathan Logan, David Chamberlin, and Brian C. Smith. 1995. Query by humming: Musical information retrieval in an audio database. In Proceedings of the third ACM international conference on multimedia, 231–236. ACM.
63.
Shi, Jianbo, and Carlo Tomasi. 1994. Good features to track. In 1994 IEEE computer society conference on computer vision and pattern recognition (CVPR’94), 593–600. IEEE.
64.
Hoffman, Guy. 2012. Dumb robots, smart phones: A case study of music listening companionship. In 2012 IEEE RO-MAN, 358–363. IEEE.
65.
Puckette, Miller S., Theodore Apel, et al. 1998. Real-time audio analysis tools for Pd and MSP.
66.
Davies, Matthew E.P., and Mark D. Plumbley. 2004. Causal tempo tracking of audio. In ISMIR.
67.
Sun, Sisi, Trishul Mallikarjuna, and Gil Weinberg. Effect of visual cues in synchronization of rhythmic patterns.
68.
Albin, Aaron, S. Lee, and Parag Chordia. 2011. Visual anticipation aids in synchronization tasks. In Proceedings of the society for music perception and cognition (SMPC).
69.
Burkhardt, Felix. 2005. Emofilt: The simulation of emotional speech by prosody-transformation. In INTERSPEECH, 509–512.
Metadata
Title
“Be Social”—Embodied Human-Robot Musical Interactions
Authors
Gil Weinberg
Mason Bretan
Guy Hoffman
Scott Driscoll
Copyright Year
2020
DOI
https://doi.org/10.1007/978-3-030-38930-7_5
