Meta-learning

Tags: Machine learning

Constrained meta-learning

(Kirsch and Schmidhuber 2021)

Meta-learning of initialization

The goal is to learn an initialization of neural network parameters, or of a recurrent network's initial state, such that subsequent training is faster or less prone to getting stuck in poor local minima.

Example for implicit neural representations: (Tancik et al. 2021)

Meta-learning algorithms

MAML

(Finn, Abbeel, and Levine 2017)
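MAML trains an initialization so that one (or a few) gradient steps on a new task already give good performance: an inner loop adapts to a sampled task, and an outer loop updates the initialization through the adapted parameters. A minimal sketch using the first-order approximation (FOMAML, which drops the second-order term of the meta-gradient); the one-parameter linear task family and all hyperparameters below are illustrative assumptions, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_task():
    # Illustrative task family: noiseless linear regression y = a * x
    # with a random slope a, split into a support set and a query set.
    a = rng.uniform(0.5, 1.5)
    x = rng.uniform(-1.0, 1.0, size=40)
    y = a * x
    return (x[:20], y[:20]), (x[20:], y[20:])

def grad(w, x, y):
    # d/dw of the mean squared error of the model y_hat = w * x.
    return np.mean(2.0 * (w * x - y) * x)

def fomaml(meta_steps=2000, inner_lr=1.0, meta_lr=0.1):
    w = 0.0  # the meta-learned initialization
    for _ in range(meta_steps):
        (xs, ys), (xq, yq) = sample_task()
        w_adapted = w - inner_lr * grad(w, xs, ys)  # inner adaptation step (support set)
        w = w - meta_lr * grad(w_adapted, xq, yq)   # first-order meta-update (query set)
    return w
```

The initialization converges toward the centre of the task distribution, from which a single inner step already moves most of the way to any sampled slope. Full MAML would instead backpropagate through `w_adapted`, which requires second-order derivatives.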

Reptile

(Nichol, Achiam, and Schulman 2018)
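Reptile reaches a similar effect without differentiating through the inner loop at all: adapt to a sampled task with k steps of plain SGD, then move the initialization a fraction of the way toward the adapted weights. A sketch on an illustrative one-parameter regression family (an assumption, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_task():
    # Illustrative task family: noiseless linear regression y = a * x.
    a = rng.uniform(0.5, 1.5)
    x = rng.uniform(-1.0, 1.0, size=20)
    return x, a * x

def sgd_steps(w, x, y, lr=0.5, k=5):
    # Plain inner-loop SGD on the task's mean squared error.
    for _ in range(k):
        w = w - lr * np.mean(2.0 * (w * x - y) * x)
    return w

def reptile(meta_steps=1000, outer_lr=0.1):
    w = 0.0  # the meta-learned initialization
    for _ in range(meta_steps):
        x, y = sample_task()
        w_task = sgd_steps(w, x, y)       # adapt to the sampled task
        w = w + outer_lr * (w_task - w)   # move the init toward the adapted weights
    return w
```

The outer update `w + ε (w_task − w)` is all there is; no meta-gradient is computed, which is why the paper calls it a first-order algorithm.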

Bibliography

  1. Finn, Chelsea, Pieter Abbeel, and Sergey Levine. 2017. "Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks". In Proceedings of the 34th International Conference on Machine Learning, 70:10. Sydney, Australia.

  2. Kirsch, Louis, and Jürgen Schmidhuber. 2021. "Meta Learning Backpropagation and Improving It". arXiv:2012.14905 [Cs, Stat]. http://arxiv.org/abs/2012.14905.

  3. Nichol, Alex, Joshua Achiam, and John Schulman. 2018. "On First-Order Meta-Learning Algorithms". arXiv:1803.02999 [Cs]. http://arxiv.org/abs/1803.02999.

  4. Tancik, Matthew, et al. 2021. "Learned Initializations for Optimizing Coordinate-Based Neural Representations". arXiv:2012.02189 [Cs]. http://arxiv.org/abs/2012.02189.
