ELECTRA

Tags: Transformers, NLP
Paper: (Clark et al. 2020)

Parameter count

  • Base = 110M
  • Large = 335M
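
A quick way to sanity-check these counts is to load the released ELECTRA discriminator checkpoints with the Hugging Face `transformers` library and sum the parameters. A minimal sketch follows; the `google/electra-*` checkpoint names are the public ones on the Hugging Face Hub, not something stated in this note.

```python
# Minimal sketch: count parameters of the released ELECTRA
# discriminator checkpoints. Assumes the Hugging Face `transformers`
# library and the public google/electra-* checkpoints, neither of
# which is specified in this note.
from transformers import ElectraForPreTraining

for name in ("google/electra-base-discriminator",
             "google/electra-large-discriminator"):
    model = ElectraForPreTraining.from_pretrained(name)
    n_params = sum(p.numel() for p in model.parameters())
    print(f"{name}: {n_params / 1e6:.0f}M parameters")
```

Note that these are the discriminator-only checkpoints; ELECTRA's generator is discarded after pre-training, so the numbers above are what you fine-tune with.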

Bibliography

  1. Clark, Kevin, Minh-Thang Luong, Quoc V. Le, and Christopher D. Manning. 2020. "ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators". In 8th International Conference on Learning Representations, ICLR 2020, Addis Ababa, Ethiopia, April 26-30, 2020. OpenReview.net. https://openreview.net/forum?id=r1xMH1BtvB.