2021 | Original Paper | Book Chapter

ReMERT: Relational Memory-Based Extraction for Relational Triples

Authors: Chongshuai Zhao, Xudong Dai, Lin Feng, Peng Liu

Published in: Natural Language Processing and Chinese Computing

Publisher: Springer International Publishing


Abstract

Relational triple extraction aims to detect entity pairs (subjects, objects) along with their relations. Previous work has struggled with complex relational triples, such as overlapping triples and nested entities, and has lacked semantic representation when extracting entity pairs and relations. To mitigate these issues, we propose a joint extraction model called ReMERT, which decomposes the joint extraction task into three interrelated subtasks: RSE (Relation-specific Subject Extraction), RM (Relational Memory) module construction, and OE (Object Extraction). The first subtask distinguishes all subjects that may be involved with target relations, the second retrieves the target relation representation from the RM module, and the last identifies the corresponding objects for each specific (s, r) pair. The RSE and OE subtasks are further cast as sequence labeling problems based on the proposed hierarchical binary tagging scheme. Owing to this decomposition strategy, the proposed model can fully capture the semantic interdependency between subtasks and reduce noise from irrelevant entity pairs. Experimental results show that the proposed method outperforms previous work by 0.8% (F1 score), achieving a new state of the art on the Chinese DuIE dataset. We also conduct extensive experiments and obtain promising results on both the public English NYT and Chinese DuIE datasets.
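
The decomposition described above can be illustrated with a minimal PyTorch sketch. The layer sizes, module names, and the way the relational memory is fused with the token encodings are illustrative assumptions, not the authors' exact architecture; the sketch only shows the control flow of tagging subjects per relation (RSE), looking up a relation representation (RM), and tagging objects for a given (s, r) pair (OE) with binary start/end taggers.

```python
# A minimal, assumption-based sketch of the RSE -> RM -> OE pipeline.
# Not the paper's exact architecture: fusion by simple addition and the
# averaged subject-span vector are illustrative choices.
import torch
import torch.nn as nn


class ReMERTSketch(nn.Module):
    def __init__(self, hidden_size: int, num_relations: int):
        super().__init__()
        # Relational Memory (RM): one learned vector per relation type.
        self.relation_memory = nn.Embedding(num_relations, hidden_size)
        # RSE: relation-specific binary taggers for subject start/end positions.
        self.subj_start = nn.Linear(hidden_size, num_relations)
        self.subj_end = nn.Linear(hidden_size, num_relations)
        # OE: binary taggers for object start/end, conditioned on (s, r).
        self.obj_start = nn.Linear(hidden_size, 1)
        self.obj_end = nn.Linear(hidden_size, 1)

    def forward(self, token_states, subj_start_idx, subj_end_idx, relation_id):
        # token_states: (batch, seq_len, hidden) from any sentence encoder, e.g. BERT.
        # 1) RSE: per-token probability of starting/ending a subject of each relation.
        subj_start_p = torch.sigmoid(self.subj_start(token_states))  # (batch, seq_len, num_relations)
        subj_end_p = torch.sigmoid(self.subj_end(token_states))

        # 2) RM: retrieve the representation of the target relation.
        rel_vec = self.relation_memory(relation_id)                  # (batch, hidden)

        # 3) OE: fuse a chosen subject span and the relation memory into the
        #    token states, then tag object start/end for this (s, r) pair.
        batch_idx = torch.arange(token_states.size(0))
        subj_vec = 0.5 * (token_states[batch_idx, subj_start_idx]
                          + token_states[batch_idx, subj_end_idx])   # (batch, hidden)
        fused = token_states + (subj_vec + rel_vec).unsqueeze(1)
        obj_start_p = torch.sigmoid(self.obj_start(fused)).squeeze(-1)  # (batch, seq_len)
        obj_end_p = torch.sigmoid(self.obj_end(fused)).squeeze(-1)
        return subj_start_p, subj_end_p, obj_start_p, obj_end_p
```

At inference time, subjects whose start/end probabilities exceed a threshold would be paired with each candidate relation and passed back through the OE step to recover complete triples, which is how overlapping triples sharing a subject or relation could be handled under this kind of cascade tagging.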


Metadata
Title: ReMERT: Relational Memory-Based Extraction for Relational Triples
Authors: Chongshuai Zhao, Xudong Dai, Lin Feng, Peng Liu
Copyright Year: 2021
DOI: https://doi.org/10.1007/978-3-030-88480-2_24