
2021 | Original Paper | Book Chapter

Multi-level Cohesion Information Modeling for Better Written and Dialogue Discourse Parsing

Authors: Jinfeng Wang, Longyin Zhang, Fang Kong

Published in: Natural Language Processing and Chinese Computing

Publisher: Springer International Publishing


Abstract

Discourse parsing has attracted increasing attention due to its importance for Natural Language Understanding, and various neural models have been proposed with some success. However, because the available corpora are limited in scale, strong performance still depends on additional features. Unlike previous neural studies that employ simple flat word-level EDU (Elementary Discourse Unit) representations, we improve discourse parsing by enhancing EDU representations with cohesion information (in this paper, we regard lexical chains and coreference chains as cohesion information). Specifically, we first use WordNet and a coreference resolution model to automatically extract lexical chains and coreference chains, respectively. We then construct an EDU-level graph based on the extracted chains. Finally, we use a Graph Attention Network to incorporate the resulting cohesion information into the EDU representations. Experiments on RST-DT, CDTB and STAC show that the proposed cohesion-enhanced EDU representations benefit both written and dialogue discourse parsing, compared with the baseline model we reimplemented.
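
To make the modeling idea concrete, the following is a minimal sketch (not the authors' released code) of the two steps the abstract describes: linking EDUs that share elements of a lexical or coreference chain into an EDU-level graph, and running a single graph attention layer over that graph to obtain cohesion-enhanced EDU representations. All identifiers (build_cohesion_adjacency, GATLayer, the 128-dimensional EDU encodings) are illustrative assumptions, not the paper's actual implementation.

```python
# Sketch: EDUs that share elements of a lexical or coreference chain are
# linked in an EDU-level graph; a graph attention layer propagates
# information along those links to enrich each EDU representation.
import torch
import torch.nn as nn
import torch.nn.functional as F


def build_cohesion_adjacency(num_edus, chains):
    """chains: list of chains, each a list of EDU indices that contain
    elements (words / mentions) of one lexical or coreference chain."""
    adj = torch.eye(num_edus)              # self-loops
    for chain in chains:
        for i in chain:
            for j in chain:
                adj[i, j] = 1.0            # connect EDUs on the same chain
    return adj


class GATLayer(nn.Module):
    """Single-head graph attention layer in the style of Velickovic et al. (2017)."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.a = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, h, adj):
        # h: (num_edus, in_dim), adj: (num_edus, num_edus) with 1.0 for edges
        z = self.W(h)
        n = z.size(0)
        zi = z.unsqueeze(1).expand(n, n, -1)            # z_i repeated over columns
        zj = z.unsqueeze(0).expand(n, n, -1)            # z_j repeated over rows
        e = F.leaky_relu(self.a(torch.cat([zi, zj], dim=-1)).squeeze(-1))
        e = e.masked_fill(adj == 0, float('-inf'))      # attend only along cohesion edges
        alpha = torch.softmax(e, dim=-1)
        return F.elu(alpha @ z)                         # cohesion-enhanced EDU states


# Toy usage: 4 EDUs, one lexical chain covering EDUs {0, 2} and one
# coreference chain covering EDUs {1, 2, 3}.
edu_states = torch.randn(4, 128)                        # e.g. BiLSTM EDU encodings
adj = build_cohesion_adjacency(4, [[0, 2], [1, 2, 3]])
enhanced = GATLayer(128, 128)(edu_states, adj)
print(enhanced.shape)                                   # torch.Size([4, 128])
```

In the paper's setting, the enhanced EDU states would then feed the downstream discourse parser; the sketch only covers graph construction and the attention step.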


Footnotes
1
In dialogue text, each utterance corresponds to an EDU.
 
2
The split position between any two neighboring EDUs is called the split point.
 
3
There will be \(n-2\) split points for n EDUs.
 
4
A word similarity measure provided by WordNet; it returns a score between 0 and 1 denoting how similar two word senses are, based on the shortest path connecting the senses. When there is no path between two senses, −1 is returned.
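
For illustration, here is a small sketch of this path-based similarity using NLTK's WordNet interface (the footnote does not name a toolkit, so NLTK is an assumption); NLTK returns None when no path exists, which the helper below maps to the −1 convention described above.

```python
# Requires the WordNet corpus: nltk.download('wordnet')
from nltk.corpus import wordnet as wn


def word_path_similarity(word1, word2):
    """Best path similarity over all sense pairs; -1 if no path exists."""
    best = -1.0
    for s1 in wn.synsets(word1):
        for s2 in wn.synsets(word2):
            score = s1.path_similarity(s2)   # in (0, 1], or None if no path
            if score is not None and score > best:
                best = score
    return best


print(word_path_similarity("car", "automobile"))  # 1.0 (shared synset)
print(word_path_similarity("dog", "cat"))         # 0.2 (short hypernym path)
```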
 
5
We use the same method to build the lexical and coreference graphs.
 
6
To simplify the exposition, the words in lexical chains and the mentions in coreference chains are collectively referred to as elements.
 
7
Following previous work, we used the version released on March 21, 2018.
 
References
1. Afantenos, S., Kow, E., Asher, N., Perret, J.: Discourse parsing for multi-party chat dialogues. Association for Computational Linguistics (ACL) (2015)
2. Asher, N., Hunter, J., Morey, M., Benamara, F., Afantenos, S.: Discourse structure and dialogue acts in multiparty dialogue: the STAC corpus (2016)
3. Asher, N., Lascarides, A.: Logics of Conversation. Peking University Press (2003)
4. Carlson, L., Marcu, D., Okurowski, M.E.: Building a discourse-tagged corpus in the framework of rhetorical structure theory. Association for Computational Linguistics (2001)
6. Dozat, T., Manning, C.D.: Deep biaffine attention for neural dependency parsing (2016)
7. Fang, K., Fu, J.: Incorporating structural information for better coreference resolution. In: Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI-19) (2019)
8. Feng, V.W., Hirst, G.: A linear-time bottom-up discourse parser with constraints and post-editing. In: Meeting of the Association for Computational Linguistics (2014)
9. Ji, Y., Eisenstein, J.: Representation learning for text-level discourse parsing. In: Meeting of the Association for Computational Linguistics (2014)
11. Kobayashi, N., Hirao, T., Kamigaito, H., Okumura, M., Nagata, M.: Top-down RST parsing utilizing granularity levels in documents. In: Proceedings of the AAAI Conference on Artificial Intelligence (2020)
12. Li, J., Li, R., Hovy, E.: Recursive deep models for discourse parsing. In: Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), pp. 2061–2069 (2014)
13. Li, Q., Li, T., Chang, B.: Discourse parsing with attention-based hierarchical neural networks. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pp. 362–371 (2016)
14. Li, Y., Feng, W., Jing, S., Fang, K., Zhou, G.: Building Chinese discourse corpus with connective-driven dependency tree structure. In: Conference on Empirical Methods in Natural Language Processing (2014)
15. Morris, J., Hirst, G.: Lexical cohesion computed by thesaural relations as an indicator of the structure of text. Comput. Linguist. (1991)
16. Nan, Y., Zhang, M., Fu, G.: Transition-based neural RST parsing with implicit syntax features (2018)
17. Perret, J., Afantenos, S., Asher, N., Morey, M.: Integer linear programming for discourse parsing. In: Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2016), pp. 99–109 (2016)
18. Shi, Z., Huang, M.: A deep sequential model for discourse parsing on multi-party dialogues (2018)
19. Sun, C., Fang, K.: A transition-based framework for Chinese discourse structure parsing. J. Chin. Inf. Process. (2018)
20. Takanobu, R., Huang, M., Zhao, Z., Li, F., Nie, L.: A weakly supervised method for topic segmentation and labeling in goal-oriented dialogues via reinforcement learning. In: Twenty-Seventh International Joint Conference on Artificial Intelligence (IJCAI-18) (2018)
21. Veličković, P., Cucurull, G., Casanova, A., Romero, A., Liò, P., Bengio, Y.: Graph attention networks (2017)
22. Xu, J., Gan, Z., Cheng, Y., Liu, J.: Discourse-aware neural extractive model for text summarization (2019)
23. Zhang, L., Xing, Y., Kong, F., Li, P., Zhou, G.: A top-down neural architecture towards text-level parsing of discourse rhetorical structure. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics (2020)
24. Zhang, L., Xing, Y., Kong, F., Li, P., Zhou, G.: A top-down neural architecture towards text-level parsing of discourse rhetorical structure. arXiv preprint arXiv:2005.02680 (2020)
25. Zhu, X., Runcong, M.A., Sun, L., Chen, H.: Word semantic similarity computation based on HowNet and CiLin. J. Chin. Inf. Process. (2016)
Metadata
Title
Multi-level Cohesion Information Modeling for Better Written and Dialogue Discourse Parsing
Authors
Jinfeng Wang
Longyin Zhang
Fang Kong
Copyright Year
2021
DOI
https://doi.org/10.1007/978-3-030-88480-2_4