# Word vectors

tags: NLP

## Definition

Word vectors are abstract representations of words embedded in a dense vector space.

They are closely related to language modeling, since the implicit representation a language model builds for prediction can often be reused as a word (or sentence) vector.
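A minimal sketch of how dense vectors can arise from distributional statistics: build a word-word co-occurrence matrix from a toy corpus and take a truncated SVD, so each word gets a low-dimensional row. The corpus and the 2-dimensional size are hypothetical choices for illustration, not a real training setup.

```python
import numpy as np

# Toy corpus (hypothetical; real embeddings use far larger corpora).
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat chased the dog",
]

# Vocabulary and a symmetric co-occurrence matrix with a +/-1 word window.
vocab = sorted({w for line in corpus for w in line.split()})
index = {w: i for i, w in enumerate(vocab)}
counts = np.zeros((len(vocab), len(vocab)))
for line in corpus:
    words = line.split()
    for i, w in enumerate(words):
        for j in (i - 1, i + 1):
            if 0 <= j < len(words):
                counts[index[w], index[words[j]]] += 1

# Truncated SVD: each row of U[:, :dim] * S[:dim] is a dense word vector.
u, s, _ = np.linalg.svd(counts, full_matrices=False)
dim = 2
vectors = u[:, :dim] * s[:dim]
print(vectors.shape)
```

Here `vectors` has one row per vocabulary word; language models learn analogous representations implicitly rather than from explicit counts.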

## Usage

Word vectors can encode useful information, such as semantic similarity between words. This can help with text classification tasks, since it may be easier to learn a mapping from this dense intermediate space to labels than from the space of one-hot encoded words/sentences.
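The semantic-similarity point can be sketched with cosine similarity. The vectors below are made-up numbers (not real pretrained embeddings), chosen so that related words point in similar directions.

```python
import numpy as np

# Hypothetical word vectors: "cat" and "dog" point in similar directions,
# "car" points elsewhere.
vectors = {
    "cat": np.array([0.9, 0.1, 0.0]),
    "dog": np.array([0.8, 0.2, 0.1]),
    "car": np.array([0.0, 0.1, 0.9]),
}

def cosine(a, b):
    # Cosine similarity: 1.0 for identical directions, 0.0 for orthogonal ones.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Semantically related words score higher than unrelated ones.
print(cosine(vectors["cat"], vectors["dog"]) > cosine(vectors["cat"], vectors["car"]))  # True
```

With real embeddings the same comparison works, but the geometry is learned from data rather than hand-picked.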
