Word vectors are abstract representations of words embedded in a dense vector space.
They are closely related to language modeling, since the internal representation a language model builds for prediction can often be reused as a word (or sentence) vector.
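As a minimal sketch of this idea (using PyTorch, with hypothetical sizes and an untrained model purely for illustration): a language model's embedding table maps token ids to dense vectors, and each row of that table can serve as a word vector once the model is trained.

```python
import torch
import torch.nn as nn

vocab_size, dim = 10_000, 128              # hypothetical vocabulary and vector sizes
embedding = nn.Embedding(vocab_size, dim)  # in practice, trained jointly with the LM
word_id = torch.tensor([42])               # id of some word in the vocabulary
vector = embedding(word_id)                # dense word vector, shape (1, 128)
print(vector.shape)
```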
Word vectors can encode useful information, such as semantic similarity between words. This helps with text classification tasks, since it may be easier to learn a mapping from this intermediate space to a label than from the space of one-hot encoded words/sentences.
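To make the contrast concrete, here is a toy sketch with invented 3-dimensional vectors (real embeddings are learned and have hundreds of dimensions; the words and values below come from no actual model): cosine similarity between dense vectors can reflect semantic relatedness, whereas any two distinct one-hot vectors are orthogonal and so carry no similarity information at all.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity: 1.0 for parallel vectors, 0.0 for orthogonal ones."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

# Dense word vectors: semantically related words point in similar directions.
vectors = {
    "cat": np.array([0.9, 0.8, 0.1]),
    "dog": np.array([0.8, 0.9, 0.2]),
    "car": np.array([0.1, 0.2, 0.9]),
}
print(cosine(vectors["cat"], vectors["dog"]))  # high (~0.99)
print(cosine(vectors["cat"], vectors["car"]))  # low  (~0.30)

# One-hot vectors: every pair of distinct words is orthogonal,
# so similarity is always 0 and encodes nothing about meaning.
one_hot = {w: np.eye(3)[i] for i, w in enumerate(vectors)}
print(cosine(one_hot["cat"], one_hot["dog"]))  # 0.0
```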
- Hinrich Schütze. 1993. "Word Space". In Advances in Neural Information Processing Systems 5, 895–902. Morgan Kaufmann.