Text embeddings

tags
Machine learning, NLP

Dense vector representations of text for retrieval, similarity, and classification tasks.

Embeddings are trained so that semantically similar texts map to nearby points in vector space; similarity is typically measured with cosine similarity.
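A minimal sketch of measuring similarity between embeddings, using toy hand-made vectors rather than real model output:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Cosine similarity: dot product of the L2-normalized vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 4-dimensional "embeddings": the first two point in similar
# directions, the third is nearly orthogonal to them.
cat = np.array([0.9, 0.8, 0.1, 0.0])
kitten = np.array([0.85, 0.75, 0.2, 0.05])
invoice = np.array([0.0, 0.1, 0.9, 0.8])

print(cosine_similarity(cat, kitten))   # close to 1.0
print(cosine_similarity(cat, invoice))  # much lower
```

In retrieval, this score ranks candidate documents against a query embedding.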

Learning text embeddings

Transformers are the standard architecture for producing text embeddings, encoder-only models in particular, such as BERT. A sentence embedding is usually derived by pooling the encoder's per-token hidden states (e.g. mean pooling, or taking the [CLS] token).
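A sketch of mean pooling over encoder outputs, with random arrays standing in for a real model's hidden states; the attention mask excludes padding tokens from the average:

```python
import numpy as np

def mean_pool(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    # Average the token vectors, ignoring padded positions (mask == 0).
    mask = attention_mask[:, None].astype(float)      # (seq_len, 1)
    summed = (token_embeddings * mask).sum(axis=0)    # (dim,)
    return summed / mask.sum()

# Stand-in for encoder output: 5 tokens, 8-dim hidden states,
# where the last 2 positions are padding.
rng = np.random.default_rng(0)
hidden = rng.normal(size=(5, 8))
mask = np.array([1, 1, 1, 0, 0])

sentence_embedding = mean_pool(hidden, mask)
print(sentence_embedding.shape)  # (8,)
```

With a real model, `hidden` would be the last-layer hidden states and `mask` the tokenizer's attention mask.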

Contrastive learning is one of the dominant training methods for producing high-quality text embeddings (SimCSE, E5, BGE): positive pairs are pulled together while other examples in the batch serve as negatives.
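A sketch of the in-batch contrastive objective (InfoNCE) these models train with; the temperature value and batch contents here are illustrative:

```python
import numpy as np

def info_nce_loss(queries: np.ndarray, keys: np.ndarray, temperature: float = 0.05) -> float:
    # Normalize so that dot products are cosine similarities.
    q = queries / np.linalg.norm(queries, axis=1, keepdims=True)
    k = keys / np.linalg.norm(keys, axis=1, keepdims=True)
    logits = q @ k.T / temperature  # (batch, batch) similarity matrix
    # The positive for row i is column i; all other columns act as
    # in-batch negatives. Loss is cross-entropy toward the diagonal.
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    idx = np.arange(len(q))
    return float(-log_probs[idx, idx].mean())

rng = np.random.default_rng(1)
q = rng.normal(size=(4, 8))
print(info_nce_loss(q, q))                         # aligned pairs: low loss
print(info_nce_loss(q, rng.normal(size=(4, 8))))   # random pairs: higher loss
```

Minimizing this loss pushes each query embedding toward its paired key and away from the rest of the batch.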
