BART

tags: Transformers
paper: (Lewis et al. 2019)

Architecture

BART is an encoder/decoder (sequence-to-sequence) architecture: the encoder is bidirectional like BERT, and the decoder is autoregressive (left-to-right) like GPT. It is pre-trained as a denoising autoencoder: the input text is corrupted and the model learns to reconstruct the original. This generalizes the two models into a single one: BERT corresponds to using only the encoder, and GPT to using only the decoder.
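
A minimal sketch of this split using the Hugging Face transformers library (the library and the facebook/bart-base checkpoint are assumptions here, not part of the note):

    # Minimal sketch: load a pretrained BART and exercise its two halves.
    from transformers import BartTokenizer, BartForConditionalGeneration

    tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
    model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

    # The encoder reads the input bidirectionally (BERT-style); the decoder
    # generates tokens left to right (GPT-style).
    encoder = model.get_encoder()
    decoder = model.get_decoder()

    # Denoising in action: BART reconstructs text around the corrupted span.
    inputs = tokenizer("BART is a <mask> model.", return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=20)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))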

Bibliography

  1. . . "BART: Denoising Sequence-to-sequence Pre-training for Natural Language Generation, Translation, and Comprehension". arXiv. DOI.
