
Open Access 01-11-2022 | Dissertation and Habilitation Abstracts

Human-Robot Body Experience: An Artificial Intelligence Perspective

Author: Philipp Beckerle

Published in: KI - Künstliche Intelligenz | Issue 3-4/2022

Abstract

Human body experience is remarkably flexible, which enables us to integrate passive tools as well as intelligent robotic devices into our body representation. Accordingly, it can serve as a role model to make (assistive) robots interact seamlessly with their users or to provide (humanoid) robots with a human-like self-perception and behavior generation. This article discusses the potential of understanding human body experience and applying it to robotics. Particular focus is set on how to use artificial intelligence techniques and create intelligent artificial agents from insights about human body experience. The discussion is based on a summary of the author’s habilitation thesis and combines theoretical and experimental perspectives from psychology, cognitive science and neuroscience as well as computer science, engineering, and artificial intelligence. From this, it derives directions for future developments towards creating artificial body intelligence with human-like capabilities.

1 Introduction

Seminal results of psychological and neuroscientific research have created a surge of interest in human body experience [5, 6]. Understanding how humans experience their bodies and extend their body representations to integrate tools [9, 10] is not only a fundamental scientific challenge but also promises to advance many applications, especially in robotics [2, 24].
A striking example is the rubber hand illusion, which describes the embodiment of artificial body parts (or even non-anthropomorphic objects) by human individuals due to crossmodal integration of vision, touch, and proprioception [6]. Discussing this matter in cognitive science and artificial intelligence domains requires caution regarding terminology: “embodiment” is here understood to describe whether a human has a sense of ownership, location, and agency over an artifact [14] rather than an agent having a physical body as in “embodied” and situated cognition [25] or human-computer interaction [18]. Contemporary research discusses how technical means could shape this multisensory integration and how robotics can help to shed light on fundamental, human-related research questions in return [2, 19].
When artifacts are intelligent agents themselves, e.g., assistive robots, spatially and temporally aligning interaction strategies, human cognitive reasoning, and motor control to achieve common aims is a challenging task. This is particularly true if both agents, the human user and the machine, learn to adapt to each other, i.e., mutual adaptation [2, 20, 26]. Taking the human side as inspiration, e.g., via cognitive models, could help to achieve seamless interaction with assistive robots, i.e., a joint human-robot body experience, as well as to provide autonomous (humanoid) robots with human-like behavior generation [22].
This article summarizes the ideas and key findings of the author’s habilitation thesis [1] and discusses them from an artificial intelligence perspective. The interdisciplinary research questions addressed by the thesis and this article are:
  • How does human body experience relate to robot, control, and (haptic) interface design? How can this be considered in development?
  • Which human-in-the-loop experiments can help to empirically examine the users’ experiences?
  • Could human-like artificial body intelligence be realized? Would that be desirable?
Section 2 presents experimental approaches to probe the underpinning mechanisms of human body experience and explains how these mechanisms might be influenced and shaped by robot design and control. This is particularly important for the development of human-machine interfaces: Section 3 discusses results from the experiments of the thesis and puts forward design recommendations. Considering technical influence factors, Section 4 furthermore outlines the value and shortcomings of cognitive modeling in this regard and how artificial intelligence techniques might improve its capabilities. Finally, Section 5 concludes the article by suggesting future directions towards endowing robots with an artificial body intelligence.

2 Probing Human Body Experience

As understanding if and how non-corporeal objects are embodied by humans is of high interest for fundamental psychological and neuroscientific research as well as for engineering applications, recent decades have produced a multitude of experimental approaches. Rubber limb illusion paradigms play a key role in these endeavors: a rubber limb, e.g., a hand, is placed in sight of the participant while both the hidden real limb and the fake limb are stimulated haptically [5, 14]. Most participants experience the feeling of embodying the fake hand, which has been shown via objective and subjective assessment [6].
Based on these prospects, interdisciplinary research has started applying technical means to probe the complexity and plasticity of human bodily experience [1]. Two common ways to extend the experimental possibilities are the involvement of robotic devices and virtual reality. Both have different benefits: virtual reality opens up a very broad space of investigation, whereas robotic devices potentially allow for direct transfer to technical implementation, e.g., in prosthetics or telerobotics [2]. Both approaches can turn existing psychological paradigms into interactive human-in-the-loop experiments. These allow for the consideration of mutual adaptation when intelligent agents are implemented and can feed back insights and new research questions from application into fundamental science [2, 19]. However, the design of such experiments is subject to challenging requirements, of which five design factors were shown to be of crucial importance: hiding the real limb, anatomical plausibility, visual appearance, temporal delay, and software-controlled experimental conditions [1].
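To illustrate the last of these factors, the following minimal sketch shows how software-controlled experimental conditions, including the temporal delay manipulation, might be scripted in a human-in-the-loop setup. The stimulation functions are hypothetical placeholders, and the timing values are illustrative assumptions rather than the protocol used in the thesis.

```python
# A minimal, hardware-agnostic sketch of software-controlled conditions for a
# robotic/virtual hand illusion trial. The stimulator interfaces are
# hypothetical placeholders; the synchronous vs. delayed (asynchronous)
# manipulation and all timing values are illustrative assumptions.
import random
import time


def stimulate_real_hand():
    """Placeholder: trigger a tactile stimulus on the participant's hidden hand."""
    print(f"[{time.monotonic():.3f}s] tactile stimulus on real hand")


def stimulate_artificial_hand():
    """Placeholder: trigger the visible stimulus on the fake/robotic hand."""
    print(f"[{time.monotonic():.3f}s] visible stimulus on artificial hand")


def run_block(condition: str, n_strokes: int = 10, delay_s: float = 0.5):
    """Run one stimulation block; 'sync' uses no delay, 'async' adds delay_s."""
    for _ in range(n_strokes):
        stimulate_real_hand()
        if condition == "async":
            time.sleep(delay_s)  # temporal mismatch is expected to weaken embodiment
        stimulate_artificial_hand()
        time.sleep(random.uniform(0.8, 1.2))  # jittered inter-stimulus interval


if __name__ == "__main__":
    for condition in random.sample(["sync", "async"], k=2):  # randomized order
        print(f"--- condition: {condition} ---")
        run_block(condition, n_strokes=3)
```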

3 Human-Machine Interfaces

The author’s habilitation thesis reports on robotic hand and leg illusion experiments as well as virtual hand illusion studies [1]. Several studies using such technical augmentations outlined how robot design and control influence embodiment and, in particular, highlighted the relevance of (haptic) human-machine interfaces: haptic and motor feedback were found to contribute similarly to embodiment in the upper limbs [11]. Moreover, embodiment was shown to be an appropriate assessment metric to consider users’ experiences in interface design using human-in-the-loop approaches [8] and appears promising for the lower limbs as well [17]. It should be noted that non-instrumental aspects of haptic feedback, e.g., the mediation of affective and social information, might decisively contribute to device embodiment and should be considered in interface design [3].
Agency, which describes whether humans feel in control of their actions, can be seen as a subfactor of embodiment and seems suitable as an objective measure of task-appropriate and intuitive assistance [7]. This provides additional experimental approaches and metrics for human-centered robotics and interface design and underlines the potential of technically augmented psychological paradigms to improve human-robot interaction.

4 Cognitive Body Models

While the experimental evaluation of human-robot body experience can provide remarkable information for design, a broader and fundamental understanding of the underpinning psychological effects is of scientific interest. This, in turn, has technical potential if it can be captured with artificial intelligence techniques. In this respect, cognitive models of body experience are being discussed for application in robotics [22]. Drawing on the latest results from cognitive science research, Bayesian or connectionist models, predictive coding, and cognitive architectures are promising routes to understand how humans experience their bodies and how we might consider that in robot design [1, 12].
If we manage to build models of human perceptual and cognitive processes with respect to body experience, this could enable various novel technical possibilities [1]. Thinking of assistive robots, e.g., prostheses, we could use cognitive models to probe whether the device is embodied and to adapt its control to improve the users’ body experience [22]. Still, a theoretical framework to explain limb embodiment is lacking, particularly when considering structurally varying bodies, e.g., in case of amputation [4]. Modular modeling frameworks combining bottom-up multisensory integration with cognitive reasoning and top-down adaptation, e.g., learning a person’s predispositions and experiences, could be an approach to represent body experience flexibly [4, 13]. Beyond applications in assistive robotics, this might also help to endow humanoid robots with human-like body representations, which could improve their interaction capabilities and provide them with more human-like action-perception versatility [22].
So far, Bayesian cognitive models of multimodal sensory processing in rubber limb illusion paradigms have been shown to roughly predict experimental outcomes [21, 23]. However, key issues of cognitive body experience models, i.e., limitations of accuracy, individualizability, and online capabilities, remain despite promising suggestions for methodological extension [4, 13, 22].
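As an illustration of how such Bayesian models operate, the following sketch implements a minimal causal inference over a seen and a felt hand position in the spirit of [21, 23]: it computes the posterior probability that both cues share a common cause and the resulting proprioceptive drift. All noise parameters and the approximation of the independent-cause likelihood are illustrative assumptions, not values fitted to experimental data.

```python
# A minimal sketch of Bayesian causal inference over spatial cues, loosely
# following the idea of [21, 23]; all parameter values are illustrative.
import numpy as np


def gaussian(x, mu, sigma):
    """Gaussian density, used for the sensory likelihoods."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))


def causal_inference_drift(x_vis, x_prop, sigma_vis=1.0, sigma_prop=5.0,
                           p_common=0.5):
    """Estimate proprioceptive drift toward a seen (rubber) hand.

    x_vis, x_prop : observed visual and proprioceptive hand positions (cm)
    sigma_vis, sigma_prop : assumed sensory noise of each modality
    p_common : prior probability that both cues stem from one's own hand
    """
    # Likelihood under a common cause: both cues are noisy readings of one
    # underlying position, which reduces (with a flat position prior) to a
    # Gaussian in the cue difference with combined variance.
    var_sum = sigma_vis**2 + sigma_prop**2
    lik_common = gaussian(x_vis - x_prop, 0.0, np.sqrt(var_sum))

    # Likelihood under independent causes: approximated here by a constant
    # over an assumed range of plausible, unrelated hand positions.
    sigma_indep = 30.0  # assumed spatial range (cm); illustrative only
    lik_indep = 1.0 / sigma_indep

    # Posterior probability that vision and proprioception share a cause.
    post_common = (p_common * lik_common) / (
        p_common * lik_common + (1 - p_common) * lik_indep
    )

    # If the causes are shared, the optimal estimate is the reliability-weighted
    # fusion of both cues; otherwise proprioception alone is used.
    w_vis = sigma_prop**2 / var_sum
    fused = w_vis * x_vis + (1 - w_vis) * x_prop

    # Model averaging: expected hand position and resulting drift.
    estimate = post_common * fused + (1 - post_common) * x_prop
    return post_common, estimate - x_prop


if __name__ == "__main__":
    p, drift = causal_inference_drift(x_vis=12.0, x_prop=0.0)
    print(f"P(common cause) = {p:.2f}, predicted drift = {drift:.1f} cm")
```

In such a model, the predicted drift depends directly on the spatial mismatch, the assumed sensory noise, and the common-cause prior, which are exactly the kinds of quantities that individualization approaches [4, 13] would need to fit per person.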

5 Towards Artificial Body Intelligence

The outlined potential of providing robots with their own body experience, or of making assistive devices develop a joint human-robot body experience with their users, might call for endowing robots with a body intelligence. To mimic the flexibility of its human counterpart, a robotic body representation would need to adapt plastically to new environmental situations. This adaptive representation should account not only for surrounding (human) interaction partners but also for structural changes of the robot’s own body. To this end, modular modeling frameworks could integrate task-specific algorithms for perceptual functions, sensory integration, and cognitive reasoning, as outlined in Fig. 1. As suggested by Bliek et al. [4], a top-down modulation of the (sub)models’ prior knowledge through learning methods seems reasonable.
Fig. 1 presents a potential structure for artificial body intelligence based on the considerations of Bliek et al. [4]. A bottom-up path describes the processes of sensation and perception, multisensory integration, and cognition: multimodal sensory data is gathered and integrated into a common sensory representation [16] to update the situation-specific body experience. A top-down prior modulation could continuously update body and environment knowledge and might also be used to update the memory at the cognitive level.
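A minimal sketch of this bottom-up/top-down interplay is given below: multimodal cues are fused by precision weighting into a common estimate, and a slowly learning prior stands in for the accumulated body knowledge. The module boundaries, parameters, and update rule are assumptions for illustration and do not reproduce the exact architecture of Fig. 1.

```python
# Illustrative sketch: bottom-up precision-weighted fusion of multimodal cues
# into a common body estimate, plus a slow top-down update of the prior that
# stands in for accumulated body knowledge. All values are assumptions.
from dataclasses import dataclass


@dataclass
class Cue:
    value: float      # e.g., estimated hand position along one axis (cm)
    precision: float  # inverse variance of that modality


def fuse(cues: list[Cue], prior: Cue) -> Cue:
    """Bottom-up path: precision-weighted fusion of cues with the current prior."""
    total_precision = prior.precision + sum(c.precision for c in cues)
    value = (prior.value * prior.precision
             + sum(c.value * c.precision for c in cues)) / total_precision
    return Cue(value, total_precision)


def update_prior(prior: Cue, percept: Cue, learning_rate: float = 0.1) -> Cue:
    """Top-down path: slowly pull the body prior toward the integrated percept."""
    new_value = prior.value + learning_rate * (percept.value - prior.value)
    return Cue(new_value, prior.precision)


if __name__ == "__main__":
    prior = Cue(value=0.0, precision=0.5)            # long-term body knowledge
    for step in range(5):
        vision = Cue(value=12.0, precision=1.0)      # seen (artificial) hand
        proprioception = Cue(value=0.0, precision=0.2)
        percept = fuse([vision, proprioception], prior)
        prior = update_prior(prior, percept)
        print(f"step {step}: percept={percept.value:.2f}, prior={prior.value:.2f}")
```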
Aiming at a broader, potentially general, artificial body intelligence, one might consider not only adapting parameters but also making structural changes. Considering Marr’s levels of analysis [15, 27], such modifications could happen on the algorithmic and the computational level. While the former could mean exchanging submodels and subalgorithms for different tasks (suggestions are provided in Fig. 1), the latter would imply structurally adjusting the topology of the body representation model, i.e., adding, removing, or rearranging modules.
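The following compact sketch illustrates this distinction: exchanging a submodel for the same task corresponds to an algorithmic-level change, whereas inserting or removing modules restructures the topology, i.e., a computational-level change. The module names and toy fusion schemes are hypothetical and serve only to make the distinction concrete.

```python
# A compact sketch of the two kinds of structural change distinguished above.
# Module names and the toy fusion schemes are illustrative assumptions.
from typing import Callable, Dict, List

Module = Callable[[List[float]], List[float]]

# Exchangeable submodels for one task, e.g., sensory integration (algorithmic level).
INTEGRATION_SUBMODELS: Dict[str, Module] = {
    "mean_fusion": lambda cues: [sum(cues) / len(cues)],
    "median_fusion": lambda cues: [sorted(cues)[len(cues) // 2]],
}


class BodyRepresentation:
    """Body representation as an ordered chain of processing modules."""

    def __init__(self, modules: List[Module]):
        self.modules = modules

    def swap(self, index: int, module: Module) -> None:
        """Algorithmic-level change: exchange one submodel in place."""
        self.modules[index] = module

    def insert(self, index: int, module: Module) -> None:
        """Computational-level change: restructure the processing topology."""
        self.modules.insert(index, module)

    def process(self, cues: List[float]) -> List[float]:
        for module in self.modules:
            cues = module(cues)
        return cues


if __name__ == "__main__":
    body = BodyRepresentation([INTEGRATION_SUBMODELS["mean_fusion"]])
    print(body.process([10.0, 0.0, 2.0]))   # mean fusion -> [4.0]
    body.swap(0, INTEGRATION_SUBMODELS["median_fusion"])
    print(body.process([10.0, 0.0, 2.0]))   # median fusion -> [2.0]
```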
This might lead to an artificial body intelligence that mimics the flexibility of its human counterpart and could adapt plastically in case of structural alterations of the robot’s body [4]. Moreover, it could adapt to changes in the environment it is situated in and to human interaction partners. The experimental results presented in this article can guide the alignment of robot hardware and software accordingly. Whereas this article focuses on the perception-related part of the human action-perception loop, artificial agents, e.g., robots, would require not only a human-like body experience but also motor control [26]. To inform the required developments, future human-in-the-loop experiments will also need to consider ecologically valid scenarios and long-term observation in daily life [1].

Acknowledgements

The author thanks all collaborators, students, and study participants. This work has been supported by the Deutsche Forschungsgemeinschaft (DFG; BE 5723/3 &11) and the Volkswagen-Foundation (Az. 9B 007).

Declarations

Conflict of interest

The author declares that there is no conflict of interest.
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

Literature
2. Beckerle P, Castellini C, Lenggenhager B (2019) Robotic interfaces for cognitive psychology and embodiment research: a research roadmap. Wiley Interdiscip Rev Cogn Sci 10(2):e1486
3. Beckerle P, Kõiva R, Kirchner EA, Bekrater-Bodmann R, Dosen S, Christ O, Abbink DA, Castellini C, Lenggenhager B (2018) Feel-good robotics: requirements on touch for embodiment in assistive robotics. Front Neurorobot 12:84
4. Bliek A, Bekrater-Bodmann R, Beckerle P (2021) Cognitive models of limb embodiment in structurally varying bodies: a theoretical perspective. Front Psychol 12
5. Botvinick M, Cohen J (1998) Rubber hands ‘feel’ touch that eyes see. Nature 391:756
6. Christ O, Reiner M (2014) Perspectives and possible applications of the rubber hand and virtual hand illusion in non-invasive rehabilitation: technological improvements and their consequences. Neurosci Biobehav Rev 44:33–44
7. Endo S, Fröhner J, Music S, Hirche S, Beckerle P (2020) Effect of external force on agency in physical human-machine interaction. Front Hum Neurosci 14
8. Fröhner J, Salvietti G, Beckerle P, Prattichizzo D (2018) Can wearable haptic devices foster the embodiment of virtual limbs? IEEE Trans Haptics 12(3):339–349
9. Giummarra MJ, Gibson SJ, Georgiou-Karistianis N, Bradshaw JL (2008) Mechanisms underlying embodiment, disembodiment and loss of embodiment. Neurosci Biobehav Rev 32:143–160
10. Holmes NP, Spence C (2004) The body schema and the multisensory representation(s) of peripersonal space. Cogn Process 5(2):94–105
11. Huynh TV, Bekrater-Bodmann R, Fröhner J, Vogt J, Beckerle P (2019) Robotic hand illusion with tactile feedback: unravelling the relative contribution of visuotactile and visuomotor input to the representation of body parts in space. PLoS ONE 14(1):e0210058
12. Kahl S, Wiese S, Russwinkel N, Kopp S (2022) Towards autonomous artificial agents with an active self: modeling sense of control in situated action. Cogn Syst Res 72:50–62
13. Litwin P (2020) Extending Bayesian models of the rubber hand illusion. Multisens Res 33(2):127–160
14. Longo MR, Schüür F, Kammers MPM, Tsakiris M, Haggard P (2008) What is embodiment? A psychometric approach. Cognition 107:978–998
15. Marr D (1982) Vision: a computational investigation into the human representation and processing of visual information. MIT Press
16. Mitchell HB. Multi-sensor data fusion: an introduction. Springer Science & Business Media
17. Penner D, Abrams AMH, Overath P, Vogt J, Beckerle P (2019) Robotic leg illusion: system design and human-in-the-loop evaluation. IEEE Trans Hum Mach Syst 49(4):372–380
18. Pustejovsky J, Krishnaswamy N (2021) Embodied human computer interaction. KI – Künstliche Intelligenz 35(3):307–327
19. Rognini G, Blanke O (2016) Cognetics: robotic interfaces for the conscious mind. Trends Cogn Sci 20(3):162–164
20. Rosenbaum DA (2009) Human motor control. Academic Press
21. Samad M, Chung AJ, Shams L (2015) Perception of body ownership is driven by Bayesian sensory inference. PLoS ONE 10(2):e0117178
22. Schürmann T, Mohler BJ, Peters J, Beckerle P (2019) How cognitive models of human body experience might push robotics. Front Neurorobot 13:14
23. Schürmann T, Vogt J, Christ O, Beckerle P (2019) The Bayesian causal inference model benefits from an informed prior to predict proprioceptive drift in the rubber foot illusion. Cogn Process 20(4):447–457
24. Toet A, Kuling IA, Krom BN, van Erp JBF (2020) Toward enhanced teleoperation through embodiment. Front Robot AI 7:14
25. Wilson M (2002) Six views of embodied cognition. Psychon Bull Rev 9(4):625–636
26. Wolpert DM (1997) Computational approaches to motor control. Trends Cogn Sci 1(6):209–216
27. Zednik C, Jakel F (2014) How does Bayesian reverse-engineering work? In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol 36
Metadata
Title
Human-Robot Body Experience: An Artificial Intelligence Perspective
Author
Philipp Beckerle
Publication date
01-11-2022
Publisher
Springer Berlin Heidelberg
Published in
KI - Künstliche Intelligenz / Issue 3-4/2022
Print ISSN: 0933-1875
Electronic ISSN: 1610-1987
DOI
https://doi.org/10.1007/s13218-022-00779-1
