Introduction
What is artificial empathy?
Empathy in the interpersonal domain
From empathy to artificial empathy
| Construct | Definition |
|---|---|
| Artificial empathy | The codification of human cognitive and affective empathy through computational models in the design and implementation of AI agents. |
| Perspective-taking (in AI) | The computational learning and modeling of individuals' thoughts and inference processes in a given situation. |
| Empathic concern (in AI) | The algorithmic recognition of an individual's distress and the creation of an impression of caring and concern from an AI agent toward the individual. |
| Emotional contagion (in AI) | The conveyance of an artificial sense or illusion of an AI agent experiencing the same emotions as the interacting party, through emotion mirroring and mimicry. |
| Customer experience | A consumer's subjective responses to interactions with a firm and firm-related stimuli. |
| Affective customer experience | The experience of moods and emotions in response to interactions with a firm. |
| Social customer experience | Consumers' relational and social identity-related responses to interactions with a firm. |
| Consumer emotional intelligence | A person's ability to skillfully use emotional information to achieve a desired consumer outcome. |
| Temporal proximity | The degree to which a consumer and an AI agent interact simultaneously (i.e., in real time). |
| Communication modality | The sensory channel through which AI agents are designed to interact with people. |
| Situational complexity | The extent to which an interaction situation is complicated, whether because of task difficulty, environmental complexity, or other factors. |
| Need for affect | A chronic tendency to approach (vs. avoid) emotional experiences. |
| Instrumental vs. experiential contexts | In an instrumental context, achieving an end goal (e.g., resolving an unsatisfactory purchase) is the main objective; in an experiential context, individuals pursue an activity mainly for its own sake, and the experience can be an end in itself. |
| Functional competence | The ability of an AI agent to satisfactorily complete the primary task of an interaction. |
| Speciesism | The assignment of different moral worth based on an individual's species membership, which views humans as superior and discriminates against non-human species. |
| Authenticity | The judgment that something is genuine, real, and true. |
| AI agency | The sense of an AI being autonomous, self-aware, and capable of purposeful action. |
| AI anthropomorphization | The design of AI agents to be more human-like through features such as a humanoid image, a human voice, or a human name. |
| Brand anthropomorphism | The extent to which consumers attribute uniquely human characteristics and features to a brand. |
| Situational involvement | The perceived importance and self-relevance of a particular situation. |
What are the dimensions of artificial empathy?
Perspective-taking
Empathic concern
Emotional contagion
The relationship among artificial empathy components
How does artificial empathy create value?
Artificial empathy and affective customer experience
- **P1:** Artificial empathy will moderate the effect of agent type on affective customer experience quality such that the gap in affective customer experience quality between AI-enabled and human-based marketing interactions will be smaller at a higher level of artificial empathy.
Artificial empathy and social customer experience
- **P2:** Artificial empathy will moderate the effect of agent type on social customer experience activation such that the gap in the activation of social customer experience between AI-enabled and human-based marketing interactions will be smaller at a higher level of artificial empathy.
- **P3:** Artificial empathy will moderate the effect of agent type on social customer experience quality such that the gap in the quality of social customer experience between AI-enabled and human-based marketing interactions will be smaller at a higher level of artificial empathy.
The effect of artificial empathy on customer experience: A pilot study
Results
**(a) Results for individual empathy dimensions (manipulation check)**

| Agent | Dimension | Low vs. High Empathy | Row Means |
|---|---|---|---|
| Human | Perspective-taking | 5.31 (1.41) vs. 5.77 (1.03), F(1, 519) = 7.02** | 5.55 (1.25) |
| Human | Empathic concern | 5.52 (1.34) vs. 6.12 (.91), F(1, 519) = 15.94*** | 5.84 (1.17) |
| Human | Emotional contagion | 4.46 (1.46) vs. 5.01 (1.27), F(1, 519) = 10.12** | 4.75 (1.39) |
| AI | Perspective-taking | 4.56 (1.64) vs. 5.47 (1.44), F(1, 519) = 24.69*** | 5.02 (1.60) |
| AI | Empathic concern | 4.99 (1.48) vs. 5.86 (1.04), F(1, 519) = 29.31*** | 5.43 (1.35) |
| AI | Emotional contagion | 3.74 (1.69) vs. 4.75 (1.60), F(1, 519) = 27.53*** | 4.27 (1.72) |
| Column means | Perspective-taking | 4.96 (1.57) vs. 5.64 (1.24), F(1, 519) = 29.46*** | 5.31 (1.44) |
| Column means | Empathic concern | 5.27 (1.43) vs. 6.00 (.98), F(1, 519) = 44.34*** | 5.65 (1.27) |
| Column means | Emotional contagion | 4.12 (1.61) vs. 4.90 (1.43), F(1, 519) = 35.88*** | 4.52 (1.57) |

MANCOVA results: a significant empathy condition effect (F(3, 517) = 12.30, p < .001), a non-significant agent condition effect, and a non-significant interaction; for the covariates, a significant age effect (F(3, 517) = 2.77, p = .04) and a marginally significant gender effect (F(3, 517) = 2.48, p = .06).

**(b) Results for the single-item empathy measure (manipulation check)**

| Agent | Low vs. High Empathy | Row Means |
|---|---|---|
| Human | 5.71 (1.32) vs. 6.08 (1.10), F(1, 519) = 4.76* | 5.90 (1.22) |
| AI | 4.70 (1.70) vs. 5.93 (1.24), F(1, 519) = 49.19*** | 5.32 (1.61) |
| Column means | 5.23 (1.59) vs. 6.01 (1.17), F(1, 519) = 43.67*** | 5.64 (1.44) |

ANCOVA results: a significant empathy condition effect (F(1, 519) = 49.19, p < .001), a non-significant agent condition effect, and a significant empathy × agent interaction (F(1, 519) = 13.67, p < .001); neither the age nor the gender effect was significant.

**(c) Results for humanness perception (manipulation check)**

| Empathy condition | Human vs. AI | Row Means |
|---|---|---|
| Low empathy | 6.23 (1.11) vs. 4.52 (1.69), F(1, 519) = 106.30*** | 5.43 (1.65) |
| High empathy | 6.41 (.94) vs. 5.37 (1.51), F(1, 519) = 41.88*** | 5.99 (1.32) |
| Column means | 6.33 (1.02) vs. 4.95 (1.65), F(1, 519) = 141.74*** | 5.70 (1.51) |

ANCOVA results: significant effects of agent condition (F(1, 519) = 41.88, p < .001), empathy condition (F(1, 519) = 23.97, p < .001), and their two-way interaction (F(1, 519) = 8.44, p = .004); neither the age nor the gender effect was significant.

**(d) Results for customer experience (CX) outcomes**

| Empathy condition | Outcome | Human vs. AI | Row Means |
|---|---|---|---|
| Low empathy | Affective CX quality | 6.14 (.97) vs. 5.65 (1.03), F(1, 519) = 16.12*** | 5.91 (1.02) |
| Low empathy | Social CX activation | 5.44 (1.43) vs. 4.62 (1.64), F(1, 519) = 19.72*** | 5.06 (1.58) |
| Low empathy | Social CX quality | 5.68 (1.24) vs. 4.60 (1.62), F(1, 519) = 43.83*** | 5.17 (1.53) |
| High empathy | Affective CX quality | 6.10 (.98) vs. 5.96 (1.01), F(1, 519) = 1.49, n.s. | 6.04 (.99) |
| High empathy | Social CX activation | 5.55 (1.29) vs. 5.33 (1.49), F(1, 519) = 1.81, n.s. | 5.45 (1.39) |
| High empathy | Social CX quality | 5.74 (1.06) vs. 5.59 (1.31), F(1, 519) = .97, n.s. | 5.67 (1.18) |
| Column means | Affective CX quality | 6.12 (.97) vs. 5.80 (1.03), F(1, 519) = 13.92*** | 5.98 (1.01) |
| Column means | Social CX activation | 5.50 (1.36) vs. 4.98 (1.60), F(1, 519) = 17.01*** | 5.26 (1.50) |
| Column means | Social CX quality | 5.71 (1.15) vs. 5.10 (1.55), F(1, 519) = 29.59*** | 5.43 (1.38) |

MANCOVA results: a significant empathy condition main effect (Wilks' Λ = .93, F(3, 517) = 12.02, p < .001), a non-significant agent condition main effect, and a significant empathy × agent condition interaction (Wilks' Λ = .97, F(3, 517) = 5.88, p < .001); neither the age nor the gender effect was significant.

Note: Cell entries are means (standard deviations). *p < .05, **p < .01, ***p < .001; n.s. = not significant.
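To make the analytic approach concrete, the following is a minimal sketch of the kind of 2 (agent: human vs. AI) × 2 (empathy: low vs. high) ANCOVA reported above. The pilot data are not available, so the sketch simulates a between-subjects sample of roughly 520 participants; all variable names (`affective_cx`, `age`, `gender`) and cell means are illustrative assumptions, not the authors' codebook.

```python
# Sketch of a 2x2 between-subjects ANCOVA with simulated data.
# Assumptions (not from the paper): variable names, cell means, SD = 1.0.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(42)
n_per_cell = 130  # ~520 participants total, echoing the reported df

rows = []
for agent in ("human", "ai"):
    for empathy in ("low", "high"):
        # Simulated cell means loosely echo the pattern in panel (d):
        # the human-AI gap in affective CX shrinks under high empathy.
        base = {"human": 6.1, "ai": 5.7}[agent]
        lift = 0.3 if (agent == "ai" and empathy == "high") else 0.0
        rows.append(pd.DataFrame({
            "affective_cx": rng.normal(base + lift, 1.0, n_per_cell),
            "agent": agent,
            "empathy": empathy,
            "age": rng.integers(18, 70, n_per_cell),
            "gender": rng.integers(0, 2, n_per_cell),
        }))
df = pd.concat(rows, ignore_index=True)

# Factorial ANCOVA: agent, empathy, their interaction, plus covariates.
model = smf.ols(
    "affective_cx ~ C(agent) * C(empathy) + age + gender", data=df
).fit()

# The C(agent):C(empathy) row is the empathy x agent interaction test.
print(anova_lm(model, typ=2))
```

The interaction term is the quantity of interest throughout panels (b)-(d): a significant `C(agent):C(empathy)` F-statistic indicates that the human-AI gap differs across empathy conditions.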
When does artificial empathy create value?
Affective customer experience
- **P4:** The ability of artificial empathy to bridge the human-AI gap in affective customer experience is contingent on the availability of high-quality emotional signals, which are more likely when the consumer possesses high emotional intelligence, when the interaction is synchronous and voice-based, and when the situation is moderately complex.
- **P5:** The ability of artificial empathy to create value by bridging the human-AI affective experience gap depends on consumers' need for affect: artificial empathy benefits high need-for-affect consumers but is unnecessary or even detrimental for low need-for-affect consumers.
- **P6:** The ability of artificial empathy to create value by bridging the human-AI affective experience gap is higher in an experiential context than in an instrumental, goal-oriented context.
- **P7:** In an instrumental, goal-oriented context, the ability of artificial empathy to create value by bridging the human-AI affective experience gap depends on the AI's functional competence.
Social customer experience
- **P8:** The ability of artificial empathy to bridge the human-AI gap in social customer experience is contingent on consumers' willingness to accept AI agents as social partners; artificial empathy can have a detrimental effect for consumers with a high level of speciesism.
- **P9:** The ability of artificial empathy to bridge the human-AI gap in social customer experience is contingent on the perceived authenticity of the experience, which is affected by both the quality of the empathy implementation and the perceived autonomy and agency of the AI agent.
- **P10:** The ability of artificial empathy to bridge the human-AI gap in social customer experience is contingent on the anthropomorphic design of the AI agent: artificial empathy is most suitable when paired with a moderately (as opposed to weakly or strongly) anthropomorphic AI agent.
- **P11:** The ability of artificial empathy to create value by bridging the human-AI social experience gap is contingent on situational involvement: artificial empathy creates more value in a high-involvement context but is unnecessary and possibly detrimental in a low-involvement context.
- **P12:** The ability of artificial empathy to create value by bridging the human-AI social experience gap is higher for brands with high anthropomorphism than for those with low anthropomorphism.