Enabling robots to understand indirect speech acts in task-based interactions

Published: 26 May 2017

Abstract

An important open problem for enabling truly taskable robots is the lack of task-general natural language mechanisms within cognitive robot architectures that enable robots to understand typical forms of human directives and generate appropriate responses. In this paper, we first provide experimental evidence that humans tend to phrase their directives to robots indirectly, especially in socially conventionalized contexts. We then introduce pragmatic and dialogue-based mechanisms to infer intended meanings from such indirect speech acts and demonstrate that these mechanisms can handle all indirect speech acts found in our experiment as well as other common forms of requests.
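The abstract's core idea, inferring a directive's intended meaning from a conventionalized indirect form (e.g. "Could you X?" understood as a request to do X rather than a question about ability), can be sketched minimally as a rule-based interpreter. This is an illustrative simplification, not the paper's actual architecture: the rule patterns, the `interpret` function, and the two-way request/statement split below are assumptions for exposition, whereas the described mechanisms combine pragmatic and dialogue-based inference.

```python
import re

# Conventionalized indirect-request forms mapped to an inferred speech act.
# These patterns are illustrative only; real pragmatic inference also weighs
# context, dialogue state, and social convention.
ISA_RULES = [
    (re.compile(r"^(could|can|would|will) you (please )?(?P<act>.+?)\??$", re.I),
     "request"),
    (re.compile(r"^i need you to (?P<act>.+?)\.?$", re.I), "request"),
    (re.compile(r"^it would be great if you (could |would )?(?P<act>.+?)\.?$",
                re.I), "request"),
]

def interpret(utterance: str):
    """Return (speech_act, content): the inferred intended meaning.

    Utterances matching a conventionalized indirect form are read as
    requests; anything else falls back to a literal statement reading.
    """
    for pattern, act_type in ISA_RULES:
        m = pattern.match(utterance.strip())
        if m:
            return (act_type, m.group("act").rstrip(".?"))
    return ("statement", utterance.strip())
```

For example, `interpret("Could you pick up the red block?")` yields `("request", "pick up the red block")`, while an utterance matching no conventional form is left with its literal reading.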
