2012 | OriginalPaper | Chapter
The Illusion of Agency: The Influence of the Agency of an Artificial Agent on Its Persuasive Power
Authors : Cees Midden, Jaap Ham
Published in: Persuasive Technology. Design for Health and Safety
Publisher: Springer Berlin Heidelberg
Artificial social agents can influence people. However, artificial social agents are not real humans, and people may ascribe less agency to them. Would the persuasive power of a social robot diminish when people ascribe little agency to it? To investigate this question, we performed an experiment in which participants performed tasks on a washing machine and received feedback from a robot about their energy consumption (e.g., “Your energy consumption is too high”), or factual, non-social feedback. This robot was introduced to participants as (a) an avatar (controlled by a human in all its feedback actions; high agency), (b) an autonomous robot (controlling its own feedback actions; moderate agency), or (c) a robot that produced only random feedback (low agency). Results indicated that participants consumed less energy when a robotic social agent gave them feedback than when they received non-social feedback. This behavioral effect was independent of the level of robotic agency. In contrast, a perceived agency measure indicated that the random-feedback robot was ascribed the lowest agency rating. These results suggest that the persuasive power of robot behavior is independent of the extent to which the persuadee explicitly ascribes agency to the agent.