
2021 | Original Paper | Book Chapter

Coreference Resolution: Are the Eliminated Spans Totally Worthless?

Authors: Xin Tan, Longyin Zhang, Guodong Zhou

Published in: Natural Language Processing and Chinese Computing

Publisher: Springer International Publishing


Abstract

To date, various neural methods have been proposed for joint mention-span detection and coreference resolution. However, existing studies on coreference resolution mainly depend on mention representations, while the remaining spans in the text are largely ignored and directly eliminated. In this paper, we investigate whether those eliminated spans are totally worthless, or to what extent they can help improve coreference resolution performance. To this end, we propose to refine mention representations by leveraging global spans, including the eliminated ones. On this basis, we further introduce an additional loss term that encourages diversity between different entity clusters. Experimental results on the document-level CoNLL-2012 Shared Task English dataset show that the eliminated spans are indeed useful and that our proposed approaches yield promising results in coreference resolution.
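The abstract mentions an additional loss term that encourages diversity between entity clusters, but does not spell out its form. One plausible formulation, shown here as a minimal illustrative sketch rather than the authors' actual implementation, penalizes the average pairwise cosine similarity between cluster-centroid representations (the function names and centroid inputs below are assumptions for illustration):

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u)) or 1e-8
    nv = math.sqrt(sum(b * b for b in v)) or 1e-8
    return dot / (nu * nv)

def diversity_loss(centroids):
    """Average pairwise cosine similarity between entity-cluster
    centroids; minimizing this term pushes distinct clusters apart."""
    k = len(centroids)
    if k < 2:
        return 0.0
    sims = [cosine(centroids[i], centroids[j])
            for i in range(k) for j in range(i + 1, k)]
    return sum(sims) / len(sims)

# Orthogonal cluster centroids incur no penalty; identical ones
# incur the maximum penalty.
print(diversity_loss([[1.0, 0.0], [0.0, 1.0]]))  # 0.0
print(diversity_loss([[1.0, 0.0], [1.0, 0.0]]))  # 1.0
```

In a neural model, a term like this would be added to the clustering objective with a small weight, so that cluster representations are discouraged from collapsing onto one another.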


Metadata

Title: Coreference Resolution: Are the Eliminated Spans Totally Worthless?
Authors: Xin Tan, Longyin Zhang, Guodong Zhou
Copyright Year: 2021
DOI: https://doi.org/10.1007/978-3-030-88480-2_1