2014 | Original Paper | Book Chapter
DRWS: A Model for Learning Distributed Representations for Words and Sentences
Authors: Chunwei Yan, Fan Zhang, Lian'en Huang
Published in: PRICAI 2014: Trends in Artificial Intelligence
Publisher: Springer International Publishing
Vector-space distributed representations of words can capture syntactic and semantic regularities in language and, by grouping similar words, help learning algorithms achieve better performance on natural language processing tasks. With the progress of machine learning techniques in recent years, much attention has been paid to this field. However, many NLP tasks, such as text summarization and sentence matching, treat sentences as atomic units. In this paper, we introduce a new model, DRWS, which learns distributed representations for words and variable-length sentences. Feature vectors for words and sentences are learned jointly with a neural network, based on the co-occurrence probabilities between words and sentences. To evaluate the learned feature vectors, we applied our model to the tasks of detecting word similarity and text summarization. Extensive experiments demonstrate the effectiveness of the proposed model in learning vector representations for words and sentences.
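The abstract does not give the model's equations, but the idea of jointly learning word and sentence vectors from word-sentence co-occurrence can be illustrated with a minimal paragraph-vector-style sketch. This is a hypothetical toy implementation, not the authors' DRWS code: each word is predicted from the average of its context-word vectors combined with the vector of the sentence it occurs in, and all vectors are trained by plain SGD over a softmax. The corpus, dimensionality, and learning rate are illustrative assumptions.

```python
import numpy as np

# Toy sketch (NOT the authors' implementation): jointly learn word and
# sentence vectors from co-occurrence, in the spirit of paragraph-vector
# models. Each word is predicted from the average of its context-word
# vectors plus its sentence's vector; all parameters trained by SGD.

rng = np.random.default_rng(0)

sentences = [
    "the cat sat on the mat".split(),
    "the dog sat on the log".split(),
]
vocab = sorted({w for s in sentences for w in s})
w2i = {w: i for i, w in enumerate(vocab)}

dim, lr, window = 8, 0.05, 1
W = rng.normal(scale=0.1, size=(len(vocab), dim))      # word vectors
S = rng.normal(scale=0.1, size=(len(sentences), dim))  # sentence vectors
O = rng.normal(scale=0.1, size=(dim, len(vocab)))      # softmax weights

def softmax(z):
    z = z - z.max()          # numerical stability
    e = np.exp(z)
    return e / e.sum()

for epoch in range(50):
    for sid, sent in enumerate(sentences):
        for pos, word in enumerate(sent):
            # indices of context words inside the window, excluding the target
            ctx = [w2i[sent[j]]
                   for j in range(max(0, pos - window),
                                  min(len(sent), pos + window + 1))
                   if j != pos]
            if not ctx:
                continue
            # combine context-word average with the sentence vector
            h = (W[ctx].mean(axis=0) + S[sid]) / 2.0
            p = softmax(h @ O)
            err = p.copy()
            err[w2i[word]] -= 1.0            # gradient of cross-entropy loss
            dh = O @ err
            O -= lr * np.outer(h, err)
            S[sid] -= lr * dh / 2.0
            for c in ctx:
                W[c] -= lr * dh / (2.0 * len(ctx))

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# words appearing in similar contexts ("cat"/"dog") should drift together
sim_cat_dog = cos(W[w2i["cat"]], W[w2i["dog"]])
print(round(sim_cat_dog, 3))
```

In this sketch the sentence vector acts as an extra context unit shared by every prediction within that sentence, which is one simple way a model can tie sentence representations to word co-occurrence statistics; the actual DRWS objective and architecture are given in the paper itself.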