- tags: Transformers, GPT, OPT (Open Pre-trained Transformer), NLP
- blog post: Meta AI announcement blog post
- paper: (Shuster et al. 2022)
Architecture
It is based on a pre-trained OPT model, with additions that make it a better dialog agent, such as long-term memory and the ability to search the web.
It uses human feedback to fine-tune its responses on some tasks.
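As a rough illustration of that modular design, the sketch below shows how an OPT backbone could be wired together with a web-search step and a long-term memory store. Only the checkpoint loading uses the real Hugging Face transformers API (with a small facebook/opt-1.3b checkpoint standing in for the 175B model); the OptDialogueAgent class, its search_web stub, and the memory list are hypothetical stand-ins, not the BlenderBot 3 implementation.

```python
# Minimal sketch of an OPT-backed dialog agent with web search and
# long-term memory. The agent class, search_web stub, and memory store
# are illustrative assumptions, not the actual BlenderBot 3 code.
from transformers import AutoModelForCausalLM, AutoTokenizer


class OptDialogueAgent:
    def __init__(self, checkpoint: str = "facebook/opt-1.3b"):
        # A small OPT checkpoint stands in for the 175B model.
        self.tokenizer = AutoTokenizer.from_pretrained(checkpoint)
        self.model = AutoModelForCausalLM.from_pretrained(checkpoint)
        self.long_term_memory: list[str] = []  # persisted dialogue notes

    def search_web(self, query: str) -> str:
        # Placeholder: a real agent would query a search API and return snippets.
        return ""

    def reply(self, user_message: str) -> str:
        # Condition generation on retrieved snippets plus stored memories,
        # mirroring the web-search and long-term-memory modules described above.
        context = "\n".join(self.long_term_memory + [self.search_web(user_message)])
        prompt = f"{context}\nUser: {user_message}\nAssistant:"
        inputs = self.tokenizer(prompt, return_tensors="pt")
        output_ids = self.model.generate(**inputs, max_new_tokens=64)
        new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
        self.long_term_memory.append(f"User said: {user_message}")
        return self.tokenizer.decode(new_tokens, skip_special_tokens=True)
```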
Parameter count
175B
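For a sense of scale, a back-of-the-envelope estimate of the raw weight storage at 16-bit precision (the ~350 GB figure is derived here, not quoted from the announcement):

```python
# Rough storage for 175B parameters at fp16 (2 bytes each);
# activations, optimizer state, and serving overhead are excluded.
params = 175e9
print(f"{params * 2 / 1e9:.0f} GB")  # ~350 GB
```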
Bibliography
- Kurt Shuster, Jing Xu, Mojtaba Komeili, Da Ju, Eric Michael Smith, Stephen Roller, Megan Ung, et al. 2022. "BlenderBot 3: A Deployed Conversational Agent That Continually Learns to Responsibly Engage". arXiv:2208.03188.