Published in: Neural Computing and Applications 7/2024

06.12.2023 | Original Article

An entity-guided text summarization framework with relational heterogeneous graph neural network

Author: Jingqiang Chen


Abstract

Two issues are crucial for generating faithful text summaries: making use of knowledge beyond the text, and making use of cross-sentence relations within the text. Natural tools for these two issues are the knowledge graph (KG) and the graph neural network (GNN), respectively. Entities are semantic units both in text and in KGs, so this paper addresses both issues by leveraging the entities mentioned in a text to connect a GNN with a KG for summarization. First, entities are used to construct a sentence-entity graph with weighted multi-type edges that models sentence relations, and a relational heterogeneous GNN for summarization is proposed to compute node encodings over this graph. Second, entities link the graph to the KG to collect knowledge: entity–entity edges are built based on the KG, entity embeddings are initialized on the KG, and the embeddings are trained using the entity–entity edges. Third, entities guide a two-step summarization framework in which a multitask selector selects salient sentences and entities and an entity-focused abstractor compresses the selected sentences. The relational heterogeneous GNN exploits both edge weights and edge types to encode graphs with weighted multi-type edges. Experiments show that the proposed method outperforms extractive baselines, including the HGNN-based HGNNSum, and abstractive baselines, including the entity-driven SENECA, on CNN/DM, and outperforms most baselines on NYT50. Experiments on sub-datasets show that the density of sentence-entity edges strongly influences performance: the denser the edges, the better the results. Ablation studies confirm the effectiveness of the method's components.
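
To make the core encoding step concrete, here is a minimal sketch of a relational heterogeneous GNN layer that uses both edge types and edge weights, in the spirit of R-GCN (Schlichtkrull et al., ref. 19 below). It is an illustration under assumed names (RelHeteroGNNLayer, the toy edge list), not the paper's implementation: each edge type gets its own projection, and messages are scaled by the edge weight.

```python
# Illustrative sketch only: one layer of a relational heterogeneous GNN
# that combines edge TYPES (sent-sent, sent-ent, ent-ent) with edge
# WEIGHTS, following the R-GCN idea. All names are hypothetical.
import torch
import torch.nn as nn

class RelHeteroGNNLayer(nn.Module):
    def __init__(self, dim: int, num_edge_types: int):
        super().__init__()
        # One linear projection per edge type.
        self.rel_proj = nn.ModuleList(
            [nn.Linear(dim, dim) for _ in range(num_edge_types)])
        self.self_proj = nn.Linear(dim, dim)  # self-loop transform

    def forward(self, h: torch.Tensor, edges):
        """h: (num_nodes, dim) node encodings.
        edges: iterable of (src, dst, edge_type, weight) tuples."""
        agg = torch.zeros_like(h)        # accumulated weighted messages
        deg = h.new_zeros(h.size(0), 1)  # accumulated edge weights
        for src, dst, etype, w in edges:
            # Type-specific projection of the neighbor, scaled by weight.
            agg[dst] = agg[dst] + w * self.rel_proj[etype](h[src])
            deg[dst] = deg[dst] + w
        agg = agg / deg.clamp(min=1.0)   # weighted-mean aggregation
        return torch.relu(self.self_proj(h) + agg)

# Toy usage: 3 sentence nodes (ids 0-2) and 2 entity nodes (ids 3-4).
layer = RelHeteroGNNLayer(dim=8, num_edge_types=3)
h = torch.randn(5, 8)
edges = [
    (0, 3, 1, 2.0),  # sentence 0 mentions entity 3 twice (sent-ent)
    (3, 0, 1, 2.0),  # reverse direction
    (3, 4, 2, 1.0),  # entity-entity edge built from a KG relation
    (0, 1, 0, 0.5),  # sentence-sentence edge (e.g., via shared entities)
]
h = layer(h, edges)
```

Stacking a few such layers yields the node encodings that a selector could score; a real implementation would batch edges per type rather than loop, but the per-edge form makes the roles of weight and type explicit.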
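
The KG-linking step in the abstract (initializing entity embeddings on the KG and training them with entity–entity edges) can be sketched with a translating-embedding objective such as TransE (Bordes et al., ref. 29 below). This is a hedged illustration: the vocabulary sizes and the random batch are placeholders, and real triples would come from the KG relations behind the entity–entity edges.

```python
# Hedged sketch of TransE-style training: embeddings are pushed so that
# head + relation ~= tail for KG triples, via a margin ranking loss over
# randomly corrupted triples. Sizes and the batch are placeholders.
import torch
import torch.nn as nn

NUM_ENTITIES, NUM_RELATIONS, DIM = 1000, 50, 128
ent_emb = nn.Embedding(NUM_ENTITIES, DIM)
rel_emb = nn.Embedding(NUM_RELATIONS, DIM)

def transe_loss(triples: torch.Tensor, margin: float = 1.0) -> torch.Tensor:
    """triples: LongTensor of shape (n, 3) holding (head, relation, tail) ids."""
    head, rel, tail = triples[:, 0], triples[:, 1], triples[:, 2]
    # Negative sampling: corrupt the tail with a random entity.
    tail_neg = torch.randint(0, NUM_ENTITIES, tail.shape)
    pos = (ent_emb(head) + rel_emb(rel) - ent_emb(tail)).norm(p=1, dim=1)
    neg = (ent_emb(head) + rel_emb(rel) - ent_emb(tail_neg)).norm(p=1, dim=1)
    return torch.relu(margin + pos - neg).mean()  # margin ranking loss

opt = torch.optim.Adam(
    list(ent_emb.parameters()) + list(rel_emb.parameters()), lr=1e-3)

# Placeholder batch; real triples come from the KG behind entity-entity edges.
triples = torch.stack([
    torch.randint(0, NUM_ENTITIES, (64,)),   # heads
    torch.randint(0, NUM_RELATIONS, (64,)),  # relations
    torch.randint(0, NUM_ENTITIES, (64,)),   # tails
], dim=1)

opt.zero_grad()
loss = transe_loss(triples)
loss.backward()
opt.step()
```

The trained ent_emb table would then initialize the entity nodes of the sentence-entity graph, which is how the GNN is tied to the KG.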


References
1. Zhuge H (2016) Multi-dimensional summarization in cyber-physical society. Morgan Kaufmann
2. Wang D, Liu P, Zheng Y, et al (2020) Heterogeneous graph neural networks for extractive document summarization. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, 6209–6219
3. Sharma E, Huang L, Hu Z, et al (2019) An entity-driven framework for abstractive summarization. In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 3271–3282
4. Cheng J, Lapata M (2016) Neural summarization by extracting sentences and words. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 484–494
5. Nallapati R, Zhai F, Zhou B (2017) SummaRuNNer: a recurrent neural network based sequence model for extractive summarization of documents. In: Proceedings of the AAAI Conference on Artificial Intelligence, 3075–3081
6. Liu Y, Lapata M (2019) Hierarchical transformers for multi-document summarization. In: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, 5070–5081
7. Li H, Zhu J, Zhang J, et al (2018) Ensure the correctness of the summary: incorporate entailment knowledge into abstractive sentence summarization. In: Proceedings of the 27th International Conference on Computational Linguistics, 1430–1441
8. Feng X, Feng X, Qin B (2021) Incorporating commonsense knowledge into abstractive dialogue summarization via heterogeneous graph networks. In: China National Conference on Chinese Computational Linguistics. Springer, Cham, 127–142
9. Li W, Xu J, He Y, et al (2019) Coherent comments generation for Chinese articles with a graph-to-sequence model. In: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, 4843–4852
10. Gunel B, Zhu C, Zeng M, et al (2020) Mind the facts: knowledge-boosted coherent abstractive text summarization. arXiv preprint arXiv:2006.15435
11. Zhang Z, Han X, Liu Z, et al (2019) ERNIE: enhanced language representation with informative entities. In: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, 1441–1451
12. Liu W, Zhou P, Zhao Z, et al (2020) K-BERT: enabling language representation with knowledge graph. In: Proceedings of the AAAI Conference on Artificial Intelligence, 2901–2908
13. Erkan G, Radev DR (2004) LexRank: graph-based lexical centrality as salience in text summarization. J Artif Intell Res 22:457–479
14. Mihalcea R, Tarau P (2004) TextRank: bringing order into text. In: Proceedings of the 2004 Conference on Empirical Methods in Natural Language Processing, 404–411
16. Zhao L, Xu W, Guo J (2020) Improving abstractive dialogue summarization with graph structures and topic words. In: Proceedings of the 28th International Conference on Computational Linguistics, 437–449
17. Yasunaga M, Zhang R, Meelu K, et al (2017) Graph-based neural multi-document summarization. In: Proceedings of the 21st Conference on Computational Natural Language Learning (CoNLL), 452–462
18. Xu J, Gan Z, Cheng Y, et al (2020) Discourse-aware neural extractive text summarization. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, 5021–5031
19. Schlichtkrull M, Kipf TN, Bloem P, et al (2018) Modeling relational data with graph convolutional networks. In: European Semantic Web Conference. Springer, Cham, 593–607
20. Hoffart J, Yosef MA, Bordino I, et al (2011) Robust disambiguation of named entities in text. In: Proceedings of the 2011 Conference on Empirical Methods in Natural Language Processing, 782–792
21. Nguyen DB, Hoffart J, Theobald M, et al (2014) AIDA-light: high-throughput named-entity disambiguation. In: Workshop on Linked Data on the Web, 1–10
22. Cao Y, Hou L, Li J, et al (2018) Neural collective entity linking. In: Proceedings of the 27th International Conference on Computational Linguistics, 675–686
23. Ristoski P, Rosati J, Di Noia T, et al (2019) RDF2Vec: RDF graph embeddings and their applications. Semant Web 10(4):721–752
24. Nenkova A (2008) Entity-driven rewrite for multi-document summarization. In: Proceedings of the Third International Joint Conference on Natural Language Processing, 118–125
26. Parveen D, Ramsl HM, Strube M (2015) Topical coherence for graph-based extractive summarization. In: Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing, 1949–1954
27. Huang L, Wu L, Wang L (2020) Knowledge graph-augmented abstractive summarization with semantic-driven cloze reward. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, 5094–5107
28. Zhuge H (2020) Cyber-physical-social intelligence on human-machine-nature symbiosis. Springer
29. Bordes A, Usunier N, Garcia-Durán A, et al (2013) Translating embeddings for modeling multi-relational data. In: Proceedings of the 26th International Conference on Neural Information Processing Systems, 2787–2795
30. Vaswani A, Shazeer N, Parmar N, et al (2017) Attention is all you need. In: Proceedings of the 31st International Conference on Neural Information Processing Systems, 6000–6010
31. Devlin J, Chang MW, Lee K, et al (2019) BERT: pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), 4171–4186
33. Gilmer J, Schoenholz SS, Riley PF, et al (2017) Neural message passing for quantum chemistry. In: International Conference on Machine Learning, PMLR, 1263–1272
35. Zhang C, Song D, Huang C, et al (2019) Heterogeneous graph neural network. In: Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 793–803
37. Wang S, Chen Z, Li D, et al (2019) Attentional heterogeneous graph neural network: application to program reidentification. In: Proceedings of the 2019 SIAM International Conference on Data Mining. Society for Industrial and Applied Mathematics, 693–701
38. Hamilton WL, Ying R, Leskovec J (2017) Inductive representation learning on large graphs. In: Proceedings of the 31st International Conference on Neural Information Processing Systems, 1025–1035
39. Wang X, Ji H, Shi C, et al (2019) Heterogeneous graph attention network. In: The World Wide Web Conference, 2022–2032
42. Xiao W, Carenini G (2019) Extractive summarization of long documents by combining global and local context. In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 3002–3012
43. Zhong M, Liu P, Wang D, et al (2019) Searching for effective neural extractive summarization: what works and what's next. In: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, 1049–1058
45. Chen YC, Bansal M (2018) Fast abstractive summarization with reinforce-selected sentence rewriting. In: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 675–686
46. Li C, Xu W, Li S, et al (2018) Guiding generation for abstractive text summarization based on key information guide network. In: Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 2: Short Papers), 55–60
47. Gehrmann S, Deng Y, Rush AM (2018) Bottom-up abstractive summarization. In: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, 4098–4109
48. Bahdanau D, Cho K, Bengio Y (2014) Neural machine translation by jointly learning to align and translate. arXiv preprint arXiv:1409.0473
49. See A, Liu PJ, Manning CD (2017) Get to the point: summarization with pointer-generator networks. In: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 1073–1083
51. Manning CD, Surdeanu M, Bauer J, et al (2014) The Stanford CoreNLP natural language processing toolkit. In: Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics: System Demonstrations, 55–60
52. Rennie SJ, Marcheret E, Mroueh Y, et al (2017) Self-critical sequence training for image captioning. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 7008–7024
53. Lin CY (2004) ROUGE: a package for automatic evaluation of summaries. In: Text Summarization Branches Out, 74–81
54. Hermann KM, Kočiský T, Grefenstette E, et al (2015) Teaching machines to read and comprehend. In: Proceedings of the 28th International Conference on Neural Information Processing Systems (Volume 1), 1693–1701
55. Durrett G, Berg-Kirkpatrick T, Klein D (2016) Learning-based single-document summarization with compression and anaphoricity constraints. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 1998–2008
56. Sandhaus E (2008) The New York Times annotated corpus. Linguistic Data Consortium, Philadelphia
57. Mikolov T, Sutskever I, Chen K, et al (2013) Distributed representations of words and phrases and their compositionality. arXiv preprint arXiv:1310.4546
59. Liu Y, Lapata M (2019) Text summarization with pretrained encoders. In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 3721–3731
60. Xu J, Durrett G (2019) Neural extractive text summarization with syntactic compression. In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 3283–3294
61. Luo L, Ao X, Song Y, et al (2019) Reading like HER: human reading inspired extractive summarization. In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 3024–3034
63. Liu Y, Titov I, Lapata M (2019) Single document summarization as tree induction. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1, 1745–1755
Metadata
Title: An entity-guided text summarization framework with relational heterogeneous graph neural network
Author: Jingqiang Chen
Publication date: 06.12.2023
Publisher: Springer London
Published in: Neural Computing and Applications / Issue 7/2024
Print ISSN: 0941-0643
Electronic ISSN: 1433-3058
DOI: https://doi.org/10.1007/s00521-023-09247-9
