2022 | Original Paper | Book Chapter

Multi-task Learning for Automatic Event-Centric Temporal Knowledge Graph Construction

Written by: Timotej Knez

Published in: Research Challenges in Information Science

Publisher: Springer International Publishing

Abstract

An important aspect of understanding written language is recognising and understanding the events described in a document. Each event is usually associated with a specific time or time period during which it occurred. Humans naturally infer the time of each event from common sense and from the relations between events expressed in the document. In our work we will explore and implement a system for the automated extraction of temporal relations between the events in a document, as well as of additional attributes such as date, time, and duration, for placing the events in time. Our system will use the extracted information to build a graph representing the events observed in a document. We will also combine the temporal knowledge from multiple documents into a global knowledge base serving as a collection of common-sense knowledge about the temporal aspects of common events, allowing the system to use this gathered knowledge to derive information not explicitly expressed in a document.
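
The approach summarised in the abstract lends itself to a simple graph representation. The following is a minimal, illustrative sketch in Python of an event-centric temporal knowledge graph: events carry optional date/time and duration attributes, pairwise temporal relations are stored as labelled edges, and per-document graphs can be folded into a global knowledge base. The class names, relation labels, and merge logic are assumptions made for illustration only; they are not the paper's actual implementation.

```python
# Illustrative sketch of an event-centric temporal knowledge graph.
# All names (Event, TemporalGraph, RELATIONS) are hypothetical, not from the paper.
from dataclasses import dataclass, field
from typing import Optional

# A small set of temporal relation labels commonly used in temporal relation extraction.
RELATIONS = {"BEFORE", "AFTER", "OVERLAP", "INCLUDES", "IS_INCLUDED"}


@dataclass
class Event:
    name: str                        # event trigger, e.g. "surgery"
    time: Optional[str] = None       # normalised date/time, e.g. "2021-05-03"
    duration: Optional[str] = None   # e.g. ISO 8601 duration "PT4H"


@dataclass
class TemporalGraph:
    events: dict[str, Event] = field(default_factory=dict)
    # edges[(a, b)] = temporal relation label between events a and b
    edges: dict[tuple[str, str], str] = field(default_factory=dict)

    def add_event(self, event: Event) -> None:
        self.events[event.name] = event

    def add_relation(self, source: str, target: str, relation: str) -> None:
        if relation not in RELATIONS:
            raise ValueError(f"unknown relation: {relation}")
        self.edges[(source, target)] = relation

    def merge(self, other: "TemporalGraph") -> None:
        """Fold another document's graph into this one (global knowledge base)."""
        for event in other.events.values():
            self.events.setdefault(event.name, event)
        self.edges.update(other.edges)


# Usage: build a per-document graph, then merge graphs across documents.
doc_graph = TemporalGraph()
doc_graph.add_event(Event("admission", time="2021-05-01"))
doc_graph.add_event(Event("surgery", time="2021-05-03", duration="PT4H"))
doc_graph.add_relation("admission", "surgery", "BEFORE")

global_kb = TemporalGraph()
global_kb.merge(doc_graph)
print(global_kb.edges)  # {('admission', 'surgery'): 'BEFORE'}
```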

Metadata
Title
Multi-task Learning for Automatic Event-Centric Temporal Knowledge Graph Construction
Author
Timotej Knez
Copyright Year
2022
DOI
https://doi.org/10.1007/978-3-031-05760-1_59