mBART

tags
Transformers, NLP, BART
paper
(Liu et al. 2020)

Architecture

mBART is a multilingual sequence-to-sequence denoising autoencoder. It uses the same encoder-decoder Transformer architecture as BART, but is pre-trained with BART's denoising objective (span masking and sentence permutation) on monolingual corpora covering many languages (25 in the original mBART-25), with language ID tokens so the decoder knows which language to generate.
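The denoising objective above can be sketched in plain Python. This is a simplified illustration, not the paper's actual implementation: `add_noise`, its parameters, and the whitespace-token representation are assumptions, though the defaults (masking about 35% of tokens in Poisson-length spans with lambda around 3.5, plus sentence-order permutation) follow the paper's described setup.

```python
import math
import random

MASK = "<mask>"

def _poisson(lam, rng):
    # Knuth's Poisson sampler; fine for small lambda like 3.5.
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def add_noise(sentences, mask_ratio=0.35, span_lam=3.5, seed=0):
    """Sketch of mBART's noise function (assumed helper, not the real API):
    permute sentence order, then replace ~mask_ratio of the tokens with
    Poisson-length spans, each span collapsed into a single <mask> token."""
    rng = random.Random(seed)
    sents = [s[:] for s in sentences]
    rng.shuffle(sents)                        # sentence permutation
    tokens = [t for s in sents for t in s]
    budget = int(len(tokens) * mask_ratio)    # total tokens to mask
    out, i = [], 0
    while i < len(tokens):
        if budget > 0 and rng.random() < mask_ratio:
            span = min(budget, max(1, _poisson(span_lam, rng)))
            out.append(MASK)                  # whole span becomes one <mask>
            i += span
            budget -= span
        else:
            out.append(tokens[i])
            i += 1
    return out
```

During pre-training the model sees the noised sequence as encoder input and must reconstruct the original text autoregressively on the decoder side.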

Bibliography

  1. Liu, Yinhan; Gu, Jiatao; Goyal, Naman; Li, Xian; Edunov, Sergey; Ghazvininejad, Marjan; Lewis, Mike; Zettlemoyer, Luke. 2020. "Multilingual Denoising Pre-training for Neural Machine Translation". arXiv.
