
2024 | Original Paper | Book Chapter

A Comparative Study of Knowledge Graph-to-Text Generation Architectures in the Context of Conversational Agents

Authors: Hussam Ghanem, Christophe Cruz

Published in: Complex Networks & Their Applications XII

Publisher: Springer Nature Switzerland


Abstract

This work examines the dynamic landscape of Knowledge Graph-to-text generation, in which structured knowledge graphs are transformed into coherent natural language text. Three key architectural paradigms are explored: Graph Neural Networks (GNNs), Graph Transformers (GTs), and linearization with sequence-to-sequence models. We discuss the advantages and limitations of each architecture and evaluate them experimentally. Performance evaluations on WebNLG V.2 demonstrate the superiority of sequence-to-sequence Transformer-based models, especially when enriched with structural information from the graph. The CycleGT model, despite being unsupervised, also outperforms GNNs and GTs. However, practical constraints such as computational efficiency and model validity make sequence-to-sequence models the preferred choice for real-time conversational agents. Future research directions include enhancing the efficiency of GNNs and GTs, addressing scalability issues, handling multimodal knowledge graphs, improving interpretability, and devising data-labeling strategies for domain-specific models. Cross-lingual and multilingual extensions can further broaden the applicability of these models in diverse linguistic contexts. In conclusion, the choice of architecture should align with specific task requirements and application constraints, and the field offers promising prospects for continued innovation and refinement.
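To make the linearization paradigm concrete, the sketch below shows one common way to flatten a set of knowledge-graph triples into a token sequence that a sequence-to-sequence model (e.g. a T5- or BART-style Transformer) can consume. The delimiter tokens `<H>`, `<R>`, `<T>` and the example triples are illustrative assumptions, not the exact format used in the chapter's experiments.

```python
def linearize_graph(triples):
    """Flatten (head, relation, tail) triples into one input string.

    Each triple is wrapped with the delimiter tokens <H>, <R>, <T>
    (an illustrative convention; real systems often register these
    as special tokens in the model's vocabulary).
    """
    return " ".join(f"<H> {h} <R> {r} <T> {t}" for h, r, t in triples)


# Hypothetical WebNLG-style input graph for illustration.
triples = [
    ("Alan_Bean", "occupation", "Test_pilot"),
    ("Alan_Bean", "mission", "Apollo_12"),
]
print(linearize_graph(triples))
# The resulting string would then be tokenized and fed to the
# seq2seq encoder, whose decoder generates the verbalized text.
```

Note that this flattening discards explicit graph structure; the structural enrichment the abstract mentions typically reinjects it, for instance via triple ordering, relative position encodings, or graph-aware attention masks.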


Metadata
Title
A Comparative Study of Knowledge Graph-to-Text Generation Architectures in the Context of Conversational Agents
Authors
Hussam Ghanem
Christophe Cruz
Copyright year
2024
DOI
https://doi.org/10.1007/978-3-031-53468-3_35
