tags:
- NLP
Definition
Word vectors are abstract representations of words embedded in a dense vector space.
They are closely related to language modeling: the implicit representation a language model builds for prediction can often be reused as a word (or sentence) vector.
Word vectors can be extracted from the intermediate representations of RNNs or transformers. They can also be created with dedicated algorithms such as Word2Vec.
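As a minimal sketch of the dedicated-algorithm route, here is Word2Vec trained with the gensim library (the toy corpus and hyperparameter values are illustrative assumptions, not a recommendation):

```python
# Training Word2Vec on a tiny toy corpus with gensim (pip install gensim).
from gensim.models import Word2Vec

corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "cat", "sleeps", "on", "the", "mat"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,  # dimensionality of the dense embedding space
    window=2,        # context window around each target word
    min_count=1,     # keep every word in this tiny corpus
    sg=1,            # use the skip-gram variant
)

vec = model.wv["king"]  # dense vector for "king"
print(vec.shape)        # (50,)
```

On a real corpus the same call scales to millions of sentences; only the corpus and hyperparameters change.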
Usage
Word vectors can encode useful information, such as semantic similarity between words. This helps with text classification, since it is often easier to learn a mapping from this dense intermediate space to a label than from the space of one-hot encoded words or sentences.
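A small numpy sketch of the contrast with one-hot encoding (the dense vectors below are synthetic stand-ins for trained embeddings, constructed to be correlated): cosine similarity between dense vectors can be high for related words, while one-hot vectors of distinct words are always orthogonal.

```python
import numpy as np

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Synthetic dense vectors: "queen" is built to correlate with "king".
rng = np.random.default_rng(0)
king, noise = rng.normal(size=50), rng.normal(size=50)
queen = 0.8 * king + 0.2 * noise

print(cosine(king, queen))  # high: related words point in similar directions

# One-hot encodings of distinct words over a 10,000-word vocabulary:
one_hot_king = np.zeros(10_000); one_hot_king[0] = 1.0
one_hot_queen = np.zeros(10_000); one_hot_queen[1] = 1.0
print(cosine(one_hot_king, one_hot_queen))  # 0.0: always orthogonal
```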
Bibliography
- Hinrich Schütze. 1993. "Word Space". In Advances in Neural Information Processing Systems 5, 895–902. Morgan Kaufmann.