
2020 | OriginalPaper | Chapter

Attentive Natural Language Generation from Abstract Meaning Representation

Authors : Radha Senthilkumar, S. Afrish Khan

Published in: New Trends in Computational Vision and Bio-inspired Computing

Publisher: Springer International Publishing


Abstract

Natural Language Generation plays a key role in presenting data as text or speech. Translating a semantic representation into natural language resembles Neural Machine Translation, so we adopt a similar methodology, sequence-to-sequence (Seq2Seq) modelling, to generate natural language. Using a common semantic representation such as Abstract Meaning Representation (AMR) adds naturalness to the generated sentences while remaining domain-neutral. A Recurrent Neural Network based autoencoder learns a hidden representation from the semantic input, which is then used to generate natural language. Long Short-Term Memory, while in theory capable of learning long-term dependencies, fails to capture the information required for generation. We introduce an attention mechanism to improve the capture of contextually important information. The resulting model achieves significantly improved accuracy.
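The core idea of the attention step described in the abstract can be illustrated with a minimal sketch. This is not the chapter's implementation: it assumes simple dot-product scoring and invented shapes (5 encoder states of size 8, standing in for the encoded AMR tokens), purely to show how a decoder state is turned into a context vector that weights contextually important encoder states.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical encoder output: one hidden state per input AMR token (T=5, d=8),
# and the decoder's current hidden state (d=8).
encoder_states = rng.normal(size=(5, 8))
decoder_state = rng.normal(size=(8,))

def dot_product_attention(H, s):
    """Context vector = attention-weighted sum of encoder hidden states."""
    scores = H @ s                            # alignment score e_i = h_i . s
    weights = np.exp(scores - scores.max())   # numerically stable softmax
    weights /= weights.sum()                  # alpha_i, sums to 1
    context = weights @ H                     # c = sum_i alpha_i * h_i
    return context, weights

context, weights = dot_product_attention(encoder_states, decoder_state)
```

At each decoding step the context vector is concatenated with the decoder state before predicting the next word, so the model can focus on different parts of the AMR graph as generation proceeds.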


Metadata
Title
Attentive Natural Language Generation from Abstract Meaning Representation
Authors
Radha Senthilkumar
S. Afrish Khan
Copyright Year
2020
Publisher
Springer International Publishing
DOI
https://doi.org/10.1007/978-3-030-41862-5_169
