2020 | OriginalPaper | Chapter

Learning Interactions at Multiple Levels for Abstractive Multi-document Summarization

Authors : Yiding Liu, Xiaoning Fan, Jie Zhou, Gongshen Liu

Published in: Neural Information Processing

Publisher: Springer International Publishing


Abstract

The biggest obstacles facing multi-document summarization are the far more complex input and the excessive redundancy in source content. Most state-of-the-art systems attempt to tackle the redundancy problem while treating the entire input as a flat sequence, so correlations among documents are often neglected. In this paper, we propose an end-to-end summarization model called MLT, which can effectively learn interactions at multiple levels and avoid redundant information. Specifically, we utilize a word-level transformer layer to encode contextual information within each sentence. We also design a sentence-level transformer layer for learning relations between sentences within a single document, as well as a document-level layer for learning interactions among input documents. Moreover, we use a neural method to enhance Maximal Marginal Relevance (MMR), a powerful algorithm for redundancy reduction. We incorporate MMR into our model and measure redundancy quantitatively based on the sentence representations. On benchmark datasets, our system compares favorably to strong summarization baselines as judged by automatic metrics and human evaluators.
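To make the redundancy-reduction idea concrete, the following is a minimal sketch of the classic MMR algorithm (Carbonell and Goldstein, 1998) that the abstract builds on, not the paper's neural variant. It assumes sentences and the query are already embedded as vectors; the function name and parameters are illustrative.

```python
import numpy as np

def mmr_select(sentence_vecs, query_vec, k=3, lam=0.7):
    """Greedy Maximal Marginal Relevance selection.

    Repeatedly picks the sentence that best balances relevance to the
    query against similarity to the sentences already selected.
    lam trades off relevance (high lam) vs. diversity (low lam).
    """
    def cos(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    candidates = list(range(len(sentence_vecs)))
    selected = []
    while candidates and len(selected) < k:
        best, best_score = None, -np.inf
        for i in candidates:
            relevance = cos(sentence_vecs[i], query_vec)
            # Penalty: highest similarity to any already-chosen sentence.
            redundancy = max((cos(sentence_vecs[i], sentence_vecs[j])
                              for j in selected), default=0.0)
            score = lam * relevance - (1 - lam) * redundancy
            if score > best_score:
                best, best_score = i, score
        selected.append(best)
        candidates.remove(best)
    return selected
```

The paper's contribution, per the abstract, is to replace the fixed similarity measures here with learned scores computed from the model's sentence representations, so redundancy is measured in the same space the summarizer decodes from.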


Metadata
Title
Learning Interactions at Multiple Levels for Abstractive Multi-document Summarization
Authors
Yiding Liu
Xiaoning Fan
Jie Zhou
Gongshen Liu
Copyright Year
2020
DOI
https://doi.org/10.1007/978-3-030-63820-7_77
