Minerva

tags
Transformers, Mathematics, PaLM
paper
(Lewkowycz et al. 2022)

Architecture

Minerva is PaLM fine-tuned on a corpus of mathematical and scientific text, drawn from arXiv papers and web pages containing LaTeX-formatted mathematics.

Parameter count

540B (largest variant; 8B and 62B models were also fine-tuned)

Bibliography

  1. Lewkowycz, Aitor, et al. 2022. "Solving Quantitative Reasoning Problems with Language Models". arXiv. http://arxiv.org/abs/2206.14858.
