Investigating Different Syntactic Context Types and Context Representations for Learning Word Embeddings

Bofang Li, Tao Liu, Zhe Zhao, Buzhou Tang, Aleksandr Drozd, Anna Rogers, Xiaoyong Du


Abstract
The number of word embedding models is growing every year. Most of them are based on the co-occurrence information of words and their contexts, but what constitutes the best definition of context remains an open question. We provide a systematic investigation of 4 different syntactic context types and context representations for learning word embeddings. Comprehensive experiments are conducted to evaluate their effectiveness on 6 extrinsic and intrinsic tasks. We hope that this paper, along with the published code, will be helpful for choosing the best context type and representation for a given task.
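As an illustrative sketch (not the paper's published code), the difference between context definitions can be shown by extracting (word, context) pairs from a toy sentence under two common schemes: linear window contexts (word2vec-style) and dependency-based contexts in the style of Levy and Goldberg (2014). The sentence, parse arcs, and labels below are hypothetical examples, not taken from the paper.

```python
def linear_contexts(tokens, window=2):
    """Linear contexts: all neighbors within a fixed-size window."""
    pairs = []
    for i, w in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((w, tokens[j]))
    return pairs

def dependency_contexts(tokens, arcs):
    """Dependency-based contexts: each word is paired with the words it is
    syntactically attached to, with the arc label folded into the context
    (the inverse direction is marked with an 'I' suffix)."""
    pairs = []
    for head, label, dep in arcs:
        pairs.append((tokens[head], f"{label}_{tokens[dep]}"))
        pairs.append((tokens[dep], f"{label}I_{tokens[head]}"))
    return pairs

tokens = ["australian", "scientist", "discovers", "star"]
# Hypothetical parse: discovers -> scientist (nsubj), discovers -> star (dobj),
# scientist -> australian (amod); arcs are (head index, label, dependent index).
arcs = [(2, "nsubj", 1), (2, "dobj", 3), (1, "amod", 0)]

print(linear_contexts(tokens, window=1))
print(dependency_contexts(tokens, arcs))
```

Under a linear window the adjective "australian" is a context of "scientist" simply by adjacency, whereas the dependency scheme additionally encodes the syntactic relation (`amod_australian`), which is one axis of variation the paper investigates.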
Anthology ID:
D17-1257
Volume:
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
Month:
September
Year:
2017
Address:
Copenhagen, Denmark
Editors:
Martha Palmer, Rebecca Hwa, Sebastian Riedel
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
2421–2431
URL:
https://aclanthology.org/D17-1257
DOI:
10.18653/v1/D17-1257
Cite (ACL):
Bofang Li, Tao Liu, Zhe Zhao, Buzhou Tang, Aleksandr Drozd, Anna Rogers, and Xiaoyong Du. 2017. Investigating Different Syntactic Context Types and Context Representations for Learning Word Embeddings. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, pages 2421–2431, Copenhagen, Denmark. Association for Computational Linguistics.
Cite (Informal):
Investigating Different Syntactic Context Types and Context Representations for Learning Word Embeddings (Li et al., EMNLP 2017)
PDF:
https://aclanthology.org/D17-1257.pdf
Data
IMDb Movie Reviews