Published in:

25.01.2023

Position-Aware Attention Mechanism–Based Bi-graph for Dialogue Relation Extraction

Authors: Guiduo Duan, Yunrui Dong, Jiayu Miao, Tianxi Huang

Published in: Cognitive Computation | Issue 1/2023

Abstract

Relation extraction in a dialogue scenario aims to extract the relations between entities in a multi-turn dialogue. Unlike the conventional relation extraction task, a dialogue relation usually cannot be determined from a single sentence, so the multi-turn dialogue must be modeled for reasoning. However, dialogue relation extraction is prone to referential ambiguity, because dialogue datasets have low information density and dialogues contain a large amount of pronominal reference. In addition, most existing models consider only token-level information interaction and do not fully exploit the interaction between utterances. To address these issues, this paper proposes a graph neural network-based dialogue relation extraction model with a position-aware refinement mechanism (PAR-DRE). First, PAR-DRE models the dependencies between speaker-related information and the individual utterances and introduces pronoun reference information, expanding the dialogue into a heterogeneous reference dialogue graph. Second, a position-aware refinement mechanism captures more discriminative node features that incorporate relative position information. On this basis, an entity graph is built by merging the abovementioned nodes, and a path reasoning mechanism infers the relations between entities in the dialogue. Experimental results on a dialogue dataset show that the F1 score of this method improves by 1.25% over current mainstream approaches.
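The position-aware refinement step described in the abstract can be sketched as an attention update in which relative-distance embeddings modulate how each graph node attends to its neighbours. The following is an illustrative reconstruction only, not the authors' implementation: the function name, the dot-product scoring, and the clipping range `max_dist` are all assumptions.

```python
import numpy as np

def position_aware_attention(node_feats, rel_pos, pos_emb, max_dist=10):
    """Sketch of position-aware attention over graph nodes.

    node_feats: (n, d) node representations
    rel_pos:    (n, n) integer relative distances between nodes
    pos_emb:    (2*max_dist+1, d) relative-position embedding table
    Returns refined (n, d) node features.
    """
    n, d = node_feats.shape
    # Clip distances into the embedding table's range and look them up.
    idx = np.clip(rel_pos, -max_dist, max_dist) + max_dist
    keys = node_feats[None, :, :] + pos_emb[idx]           # (n, n, d)
    # Scaled dot-product scores between each node and its position-aware keys.
    scores = np.einsum("id,ijd->ij", node_feats, keys) / np.sqrt(d)
    # Row-wise softmax (numerically stabilised).
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ node_feats                             # (n, d)
```

In this reading, nodes that are close in the dialogue receive position-dependent attention weights, which is one plausible way the relative location information mentioned in the abstract could make node features more discriminative.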


Metadata
Title
Position-Aware Attention Mechanism–Based Bi-graph for Dialogue Relation Extraction
Authors
Guiduo Duan
Yunrui Dong
Jiayu Miao
Tianxi Huang
Publication date
25.01.2023
Publisher
Springer US
Published in
Cognitive Computation / Issue 1/2023
Print ISSN: 1866-9956
Electronic ISSN: 1866-9964
DOI
https://doi.org/10.1007/s12559-022-10105-4