Published in: International Journal of Machine Learning and Cybernetics 7/2022

24.01.2022 | Original Article

A joint extraction model of entities and relations based on relation decomposition

Authors: Chen Gao, Xuan Zhang, Hui Liu, Wei Yun, Jiahao Jiang


Abstract

Extracting entities and relations from unstructured text is an essential task in the field of information extraction. Existing work mainly relies on pipeline extraction and joint decoding methods. However, these methods are unable to extract overlapping entities and relations, and they ignore the correlation between the entity and relation extraction tasks. In this paper, we first introduce the BERT pre-training model to model the text more finely. Then, we decompose the extraction task into relation extraction and entity recognition. Relation extraction is transformed into a relation classification task, and entity recognition is transformed into a sequence labeling task in which the recognized entities include a head entity and a tail entity. We evaluate the model on the New York Times (NYT) and WebNLG datasets, and it achieves excellent results compared with most existing models. Experimental results show that our model can fully capture the semantic interdependence between the two tasks of entity and relation extraction, reduce the interference of unrelated entity pairs, and effectively solve the problem of entity overlap.
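The abstract describes the architecture only at a high level. Below is a minimal PyTorch sketch of the general relation-decomposition idea it outlines: BERT encodes the sentence, a sentence-level classifier predicts which relations are expressed, and a relation-conditioned tagger labels head and tail entity spans. This is not the authors' exact model; the class names, tag scheme, and the way the relation is injected into the tagger are illustrative assumptions.

```python
# Sketch only: BERT encoder + relation classification + relation-conditioned
# sequence labeling of head/tail entities. Hyperparameters and the conditioning
# mechanism are assumptions, not the published architecture.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast


class RelationDecompositionModel(nn.Module):
    def __init__(self, num_relations: int, num_tags: int = 5,
                 bert_name: str = "bert-base-cased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        hidden = self.bert.config.hidden_size
        # Sentence-level relation classification: one logit per relation type.
        self.rel_classifier = nn.Linear(hidden, num_relations)
        # Relation embedding used to condition entity recognition on a relation.
        self.rel_embedding = nn.Embedding(num_relations, hidden)
        # Token-level tagger, e.g. tags {O, B-HEAD, I-HEAD, B-TAIL, I-TAIL}.
        self.tagger = nn.Linear(hidden * 2, num_tags)

    def forward(self, input_ids, attention_mask, relation_ids):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        tokens = out.last_hidden_state               # (B, L, H) token features
        pooled = out.pooler_output                   # (B, H) sentence feature
        rel_logits = self.rel_classifier(pooled)     # (B, R): which relations occur
        # Condition token representations on a candidate relation, then tag
        # the head and tail entities participating in that relation.
        rel_vec = self.rel_embedding(relation_ids)   # (B, H)
        rel_vec = rel_vec.unsqueeze(1).expand_as(tokens)
        tag_logits = self.tagger(torch.cat([tokens, rel_vec], dim=-1))  # (B, L, T)
        return rel_logits, tag_logits


# Usage sketch: classify relations for a sentence, then tag entities for one
# candidate relation (index 3 stands for a hypothetical "born_in" relation).
tokenizer = BertTokenizerFast.from_pretrained("bert-base-cased")
model = RelationDecompositionModel(num_relations=24)
enc = tokenizer("Obama was born in Honolulu.", return_tensors="pt")
rel_logits, tag_logits = model(enc["input_ids"], enc["attention_mask"],
                               relation_ids=torch.tensor([3]))
```

Because entity tagging is performed per predicted relation, overlapping triples that share entities can be recovered by running the tagger once for each relation the classifier activates, which mirrors the decomposition strategy the abstract describes.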

Metadata
Title
A joint extraction model of entities and relations based on relation decomposition
Authors
Chen Gao
Xuan Zhang
Hui Liu
Wei Yun
Jiahao Jiang
Publication date
24.01.2022
Publisher
Springer Berlin Heidelberg
Published in
International Journal of Machine Learning and Cybernetics / Issue 7/2022
Print ISSN: 1868-8071
Electronic ISSN: 1868-808X
DOI
https://doi.org/10.1007/s13042-021-01491-6
