DialoGPT

tags
GPT, Transformers, NLP
paper
(Zhang et al. 2020)

Architecture

DialoGPT reuses the GPT-2 architecture unchanged; the difference is the training data. It is trained on 147M conversation-like exchanges extracted from Reddit comment chains, with each multi-turn dialog concatenated into a single long text sequence so the standard causal language-modeling objective applies directly.
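Because it is a stock GPT-2, the released checkpoints load with the ordinary causal-LM classes in Hugging Face transformers. A minimal sketch, assuming the microsoft/DialoGPT-medium checkpoint (the checkpoint name, prompt, and generation settings are illustrative, not the paper's setup):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-medium")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-medium")

# Dialog turns are separated by the end-of-text token, so a conversation
# is just one long token sequence, exactly as in GPT-2.
history = tokenizer.encode(
    "Does money buy happiness?" + tokenizer.eos_token,
    return_tensors="pt",
)
reply_ids = model.generate(
    history,
    max_length=100,
    pad_token_id=tokenizer.eos_token_id,
)
# Decode only the newly generated tokens (the model's reply).
print(tokenizer.decode(reply_ids[0, history.shape[-1]:],
                       skip_special_tokens=True))
```

To continue the conversation, the reply is appended to the history (again terminated by the end-of-text token) and generation is run on the growing sequence.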

Parameter count

Three sizes: 117M, 345M, and 762M (the GPT-2 small, medium, and large configurations).

Bibliography

  1. Zhang, Yizhe, Siqi Sun, Michel Galley, Yen-Chun Chen, Chris Brockett, Xiang Gao, Jianfeng Gao, Jingjing Liu, and Bill Dolan. 2020. "DialoGPT: Large-Scale Generative Pre-training for Conversational Response Generation". arXiv:1911.00536.