Abstract
A key open problem in making robots truly taskable is the lack of task-general natural language mechanisms in cognitive robot architectures that allow robots to understand typical forms of human directives and to generate appropriate responses. In this paper, we first provide experimental evidence that humans tend to phrase their directives to robots indirectly, especially in socially conventionalized contexts. We then introduce pragmatic and dialogue-based mechanisms for inferring intended meanings from such indirect speech acts, and demonstrate that these mechanisms can handle all indirect speech acts found in our experiment as well as other common forms of requests.
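To make the idea of conventionalized indirect requests concrete, here is a minimal, purely illustrative sketch (not the pragmatic mechanism described in the paper) of how conventional indirect forms such as "Could you X?" can be mapped to the directive they convey; the patterns and labels below are hypothetical examples:

```python
import re

# Hypothetical conventionalized forms, ordered from most to least specific.
# Real pragmatic inference would also consult context and dialogue state.
CONVENTIONAL_FORMS = [
    # "Could you (please) open the door?" -> indirect request
    (re.compile(r"^(could|can|would|will) you (please )?(?P<act>.+?)\??$", re.I),
     "request"),
    # "I need a coffee." -> indirect request phrased as a statement of need
    (re.compile(r"^i need (?P<act>.+?)\.?$", re.I), "request"),
    # Bare imperative fallback: "(Please) hand me the wrench."
    (re.compile(r"^(please )?(?P<act>.+?)\.?$", re.I), "command"),
]

def interpret(utterance: str):
    """Return (speech_act_type, intended_action) for an utterance."""
    for pattern, act_type in CONVENTIONAL_FORMS:
        match = pattern.match(utterance.strip())
        if match:
            return act_type, match.group("act")
    return "unknown", None
```

For example, `interpret("Could you please open the door?")` yields `("request", "open the door")` rather than treating the utterance as a literal question about ability. This convention-matching step is only the surface-form half of the problem; the paper's contribution lies in the pragmatic and dialogue-based inference needed when no fixed convention applies.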