Abstract
Creating social robots that can interact with humans autonomously is a growing and promising field of research. Indeed, the number of platforms and applications for social robots has increased significantly. However, robots are not yet able to interact with humans in a natural and believable way. This is especially true for physically realistic robots, which can be affected by the Uncanny Valley. This chapter looks at motion control for a physically realistic robot named Nadine. Controllers for such a robot need to produce behaviours that match its physical realism. This chapter describes a robot controller that allows such a robot to fully use the same modalities as humans during interaction, including speech, facial expressions and bodily expressions.
© 2016 Springer International Publishing Switzerland
Cite this chapter
Beck, A., Zhijun, Z., Magnenat-Thalmann, N. (2016). Motion Control for Social Behaviors. In: Magnenat-Thalmann, N., Yuan, J., Thalmann, D., You, BJ. (eds) Context Aware Human-Robot and Human-Agent Interaction. Human–Computer Interaction Series. Springer, Cham. https://doi.org/10.1007/978-3-319-19947-4_11
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-19946-7
Online ISBN: 978-3-319-19947-4