
2018 | OriginalPaper | Chapter

Abstractive Document Summarization via Neural Model with Joint Attention

Authors: Liwei Hou, Po Hu, Chao Bei

Published in: Natural Language Processing and Chinese Computing

Publisher: Springer International Publishing


Abstract

Due to the difficulty of abstractive summarization, the great majority of past work on document summarization has been extractive. The recent success of the sequence-to-sequence framework has made abstractive summarization viable, and recurrent neural network models based on the attention encoder-decoder architecture have achieved promising performance on short-text summarization tasks. Unfortunately, these attention encoder-decoder models often suffer from two undesirable shortcomings: they tend to generate repeated words or phrases, and they cannot handle out-of-vocabulary words appropriately. To address these issues, we propose to add an attention mechanism over the output sequence to avoid repetitive content, and to use a subword method to handle rare and unknown words. We applied our model to the public dataset provided by the NLPCC 2017 Shared Task 3. The evaluation results show that our system achieved the best ROUGE performance among all participating teams and is also competitive with some state-of-the-art methods.
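As a rough illustration of the idea described in the abstract — letting the decoder attend over the tokens it has already generated in addition to the encoder states, so that repeated content is easier to suppress — the following PyTorch sketch shows one plausible decoder step. This is not the authors' implementation; the module names, dimensions, and scoring functions are illustrative assumptions.

```python
# Minimal sketch (assumed design, not the paper's code): one decoder step with
# joint attention over (a) encoder hidden states and (b) the decoder states
# produced so far. Both contexts feed the output projection.
import torch
import torch.nn as nn
import torch.nn.functional as F

class JointAttentionDecoderStep(nn.Module):
    def __init__(self, hidden_size, vocab_size):
        super().__init__()
        self.cell = nn.GRUCell(hidden_size, hidden_size)
        # separate bilinear scoring for input-side and output-side attention
        self.enc_score = nn.Linear(hidden_size, hidden_size, bias=False)
        self.out_score = nn.Linear(hidden_size, hidden_size, bias=False)
        self.project = nn.Linear(3 * hidden_size, vocab_size)

    def forward(self, y_prev, s_prev, enc_states, dec_history):
        # y_prev:      (batch, hidden)          embedding of the previous output token
        # s_prev:      (batch, hidden)          previous decoder state
        # enc_states:  (batch, src_len, hidden) encoder hidden states
        # dec_history: (batch, out_len, hidden) decoder states generated so far
        s_t = self.cell(y_prev, s_prev)

        # standard encoder-side attention over the input sequence
        enc_scores = torch.bmm(self.enc_score(enc_states), s_t.unsqueeze(2)).squeeze(2)
        enc_ctx = torch.bmm(F.softmax(enc_scores, dim=1).unsqueeze(1), enc_states).squeeze(1)

        # additional attention over the output sequence generated so far,
        # giving the model a view of what it has already said
        out_scores = torch.bmm(self.out_score(dec_history), s_t.unsqueeze(2)).squeeze(2)
        out_ctx = torch.bmm(F.softmax(out_scores, dim=1).unsqueeze(1), dec_history).squeeze(1)

        # combine the decoder state with both contexts before predicting the next token
        logits = self.project(torch.cat([s_t, enc_ctx, out_ctx], dim=1))
        return s_t, logits
```

In this sketch the output-side attention is computed the same way as the encoder-side attention; how the two contexts are combined, and how the very first step (with an empty history) is handled, are left open, since the abstract does not specify them.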


Metadata
Title
Abstractive Document Summarization via Neural Model with Joint Attention
Authors
Liwei Hou
Po Hu
Chao Bei
Copyright Year
2018
DOI
https://doi.org/10.1007/978-3-319-73618-1_28
