ERNIE

tags
Transformers, BERT, NLP
paper
(Zhang et al. 2019)

Architecture

This transformer stacks two BERT-style encoders: a textual encoder over the token sequence, and a second encoder that fuses the token representations with embeddings of the entities linked to a knowledge graph.
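As a rough sketch of that two-encoder idea (not the authors' implementation: the layer counts, dimensions, and the simple additive fusion step below are assumptions for illustration; the paper uses dedicated aggregator layers and pretrained knowledge-graph entity embeddings), the flow could look like this in PyTorch:

```python
# Minimal sketch of the dual-encoder idea with an assumed additive fusion;
# all sizes below are illustrative, not the published configuration.
import torch
import torch.nn as nn

class DualEncoderSketch(nn.Module):
    def __init__(self, vocab_size=30522, n_entities=10000,
                 d_token=256, d_entity=100, n_text_layers=2, n_fusion_layers=2):
        super().__init__()
        # Textual encoder: a BERT-style Transformer stack over the token sequence.
        self.token_emb = nn.Embedding(vocab_size, d_token)
        text_layer = nn.TransformerEncoderLayer(d_model=d_token, nhead=8,
                                                batch_first=True)
        self.text_encoder = nn.TransformerEncoder(text_layer, n_text_layers)
        # Entity embeddings are projected into the token space and fused with
        # the text states by a second Transformer stack.
        self.entity_emb = nn.Embedding(n_entities, d_entity)
        self.entity_proj = nn.Linear(d_entity, d_token)
        fusion_layer = nn.TransformerEncoderLayer(d_model=d_token, nhead=8,
                                                  batch_first=True)
        self.fusion_encoder = nn.TransformerEncoder(fusion_layer, n_fusion_layers)

    def forward(self, token_ids, entity_ids, entity_mask):
        # token_ids, entity_ids: (batch, seq), entities aligned to their tokens;
        # entity_mask: (batch, seq, 1), 1 where a token has a linked entity.
        tokens = self.text_encoder(self.token_emb(token_ids))
        entities = self.entity_proj(self.entity_emb(entity_ids)) * entity_mask
        # Additive fusion stands in for the paper's aggregator mechanism.
        return self.fusion_encoder(tokens + entities)

model = DualEncoderSketch()
out = model(torch.randint(0, 30522, (1, 16)),
            torch.randint(0, 10000, (1, 16)),
            torch.ones(1, 16, 1))
print(out.shape)  # torch.Size([1, 16, 256])
```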

Parameter count

114M

Bibliography

  1. . . "ERNIE: Enhanced Language Representation with Informative Entities". arXiv. DOI.