Text embeddings

tags
Machine learning, NLP

Dense vector representations of text for retrieval, similarity, and classification tasks.

Embeddings are trained so that semantically similar texts map to nearby points in the vector space; similarity is typically measured with cosine similarity or a dot product.
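A minimal sketch of cosine similarity between embedding vectors, using NumPy. The vectors here are toy hand-picked values, not real model outputs:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two embedding vectors (1 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 4-dimensional "embeddings"; real models output hundreds of dimensions.
cat = np.array([0.8, 0.1, 0.2, 0.0])
kitten = np.array([0.7, 0.2, 0.3, 0.1])
car = np.array([0.0, 0.9, 0.0, 0.4])

print(cosine_similarity(cat, kitten))  # close to 1: semantically similar
print(cosine_similarity(cat, car))     # much lower: dissimilar
```

Retrieval then reduces to ranking candidate vectors by this score against a query vector.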

Transformers are commonly used to produce text embeddings, encoder-only architectures such as BERT in particular.
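An encoder outputs one vector per token; a common way to collapse these into a single text embedding is mean pooling over the non-padding positions. A sketch with random stand-in token vectors (real ones would come from a model such as BERT):

```python
import numpy as np

def mean_pool(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average the per-token vectors, ignoring padding positions."""
    mask = attention_mask[:, None]  # (seq_len, 1): 1 for real tokens, 0 for padding
    return (token_embeddings * mask).sum(axis=0) / mask.sum()

# Stand-in for encoder output: 5 token positions, 8-dimensional hidden states.
rng = np.random.default_rng(0)
tokens = rng.normal(size=(5, 8))
mask = np.array([1, 1, 1, 0, 0])  # last two positions are padding

embedding = mean_pool(tokens, mask)
print(embedding.shape)  # (8,)
```

Other pooling choices exist (e.g. taking the [CLS] token's vector); mean pooling is just one widely used option.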
