2020 | OriginalPaper | Book Chapter
Word, Sense, and Graph Embeddings
Written by: Jose Manuel Gomez-Perez, Ronald Denaux, Andres Garcia-Silva
Published in: A Practical Guide to Hybrid Natural Language Processing
Distributed word representations in the form of dense vectors, known as word embeddings, are the basic building blocks of machine-learning-based natural language processing. Such embeddings play an important role in tasks such as part-of-speech tagging, chunking, named entity recognition, and semantic role labeling, as well as in downstream tasks including sentiment analysis and, more generally, text classification. However, early word embeddings were static, context-independent representations that fail to capture the multiple meanings of polysemous words. This chapter presents an overview of such traditional word embeddings, as well as of alternative approaches that produce sense and concept embeddings, either from disambiguated corpora or directly from knowledge graphs. As a result, this chapter serves as a conceptual framework for the rest of the book.
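The limitation of static embeddings mentioned above can be sketched in a few lines. This is a minimal illustrative example, not code from the chapter: the vocabulary, vectors, and `embed` helper are hypothetical, and the random vectors stand in for trained embeddings. The point is structural: a static lookup table assigns exactly one vector per surface form, so a polysemous word such as "bank" receives the same representation in every context.

```python
# Minimal sketch (hypothetical toy example): a static word-embedding table
# maps each surface form to one fixed dense vector, regardless of context.
import random

random.seed(0)
DIM = 4

# Toy embedding table: one dense vector per vocabulary word (random stand-ins
# for what would normally be trained vectors).
vocab = ["the", "bank", "river", "loan", "approved", "overflowed"]
embeddings = {w: [random.uniform(-1, 1) for _ in range(DIM)] for w in vocab}

def embed(sentence):
    """Map each token to its (context-independent) vector."""
    return [embeddings[tok] for tok in sentence.lower().split()]

financial = embed("the bank approved the loan")    # financial sense of "bank"
geographic = embed("the river bank overflowed")    # riverside sense of "bank"

# The vector for "bank" is identical in both sentences: a static embedding
# cannot separate the financial sense from the riverside sense.
assert financial[1] == geographic[2]
```

Sense and concept embeddings, covered later in the chapter, address exactly this: instead of one entry per surface form, the lookup is keyed by a disambiguated sense or knowledge-graph concept.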