2021 | Original Paper | Book Chapter

Enhancing Long-Distance Dialogue History Modeling for Better Dialogue Ellipsis and Coreference Resolution

Authors: Zixin Ni, Fang Kong

Published in: Natural Language Processing and Chinese Computing

Publisher: Springer International Publishing


Abstract

Previous work on dialogue-specific ellipsis and coreference resolution usually concatenates all dialogue history utterances into a single sequence. When the dialogue history is long, this can mislead the model into attending to inappropriate parts and copying from the wrong utterances. In this paper, we model the dialogue history at multiple granularities and take a close look at the semantic connection between the history and the omitted or coreferred expressions. To this end, we propose a speaker-highlight dialogue history encoder and a top-down hierarchical copy mechanism for generating complete utterances. Extensive experiments on the CamRest676 dataset show that our methods excel at long-distance dialogue history modeling and significantly improve ellipsis and coreference resolution performance in dialogue.
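The core idea of a top-down hierarchical copy mechanism can be illustrated with a minimal sketch. This is not the authors' implementation; all function and variable names here are hypothetical, and the representations are stand-ins for learned encoder states. The sketch only shows the factorization the abstract describes: first attend over whole history utterances, then over tokens within each utterance, so the joint copy probability of a token is the product of the two attention weights.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def hierarchical_copy_probs(dec_state, utt_reprs, token_reprs):
    """Top-down hierarchical copy sketch (hypothetical names).

    dec_state:   (d,) decoder state at the current step
    utt_reprs:   (U, d) one vector per history utterance
    token_reprs: list of U arrays, each (T_i, d), token vectors
                 of the i-th utterance
    Returns a list of per-utterance token copy distributions
    whose probabilities jointly sum to 1.
    """
    # Utterance-level attention: which history turn to copy from.
    p_utt = softmax(utt_reprs @ dec_state)            # (U,)
    # Token-level attention inside each selected utterance.
    probs = []
    for i, toks in enumerate(token_reprs):
        p_tok = softmax(toks @ dec_state)             # (T_i,)
        probs.append(p_utt[i] * p_tok)                # joint probability
    return probs
```

Because the token distributions are weighted by the utterance distribution, distant but relevant turns compete at the utterance level first, rather than having their tokens drowned out in one flat attention over the whole concatenated history.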


Metadata
Title
Enhancing Long-Distance Dialogue History Modeling for Better Dialogue Ellipsis and Coreference Resolution
Authors
Zixin Ni
Fang Kong
Copyright Year
2021
DOI
https://doi.org/10.1007/978-3-030-88480-2_38