Meta-learning

tags
Machine learning

Constrained meta-learning

(Kirsch, Schmidhuber 2021)

Meta-learning of initialization

The goal is to learn the initialization of neural-network parameters, or the initial state of a recurrent network, so that subsequent training on a new task is faster or less prone to getting stuck in poor local minima.

Example for implicit neural representations: (Tancik et al. 2021)

Meta-learning algorithms

MAML

(Finn et al. 2017)
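A minimal sketch of the MAML idea on a toy problem (the setup is assumed here for illustration, not taken from the paper): each task is a 1-D quadratic loss with its own optimum, the inner loop takes one SGD step from a shared initialization, and the outer loop differentiates through that step. On a quadratic the second-order term collapses to the closed-form factor `(1 - 2 * alpha)`, so no autodiff framework is needed.

```python
import numpy as np

# Hypothetical task distribution: task i has loss L_i(theta) = (theta - c_i)^2.
rng = np.random.default_rng(0)
task_optima = rng.normal(loc=2.0, scale=1.0, size=20)

alpha = 0.1   # inner-loop (adaptation) learning rate
beta = 0.05   # outer-loop (meta) learning rate
theta = 0.0   # the meta-learned initialization

for _ in range(500):
    meta_grad = 0.0
    for c in task_optima:
        # Inner step: one SGD update from the shared initialization.
        theta_i = theta - alpha * 2.0 * (theta - c)
        # Meta-gradient of the post-update loss w.r.t. theta:
        # d/dtheta (theta_i - c)^2 = 2 * (theta_i - c) * (1 - 2 * alpha),
        # where (1 - 2 * alpha) is the second-order (inner-step Jacobian) term.
        meta_grad += 2.0 * (theta_i - c) * (1.0 - 2.0 * alpha)
    theta -= beta * meta_grad / len(task_optima)

# theta converges toward the mean task optimum, so a single inner
# step adapts well to any task drawn from the distribution.
print(theta)
```

In this symmetric toy case the optimal initialization is simply the mean of the task optima; the point of the sketch is that MAML optimizes the *post-adaptation* loss, backpropagating through the inner update.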

Reptile

(Nichol et al. 2018)
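Reptile replaces MAML's second-order meta-gradient with a first-order update: run a few SGD steps on a task, then move the initialization toward the adapted parameters. A sketch on an assumed toy problem (1-D quadratic tasks, chosen for illustration only):

```python
import numpy as np

# Hypothetical task distribution: task i has loss L_i(theta) = (theta - c_i)^2.
rng = np.random.default_rng(1)
task_optima = rng.normal(loc=2.0, scale=1.0, size=20)

alpha = 0.1      # inner SGD learning rate
epsilon = 0.1    # Reptile interpolation step size
inner_steps = 5  # k inner SGD steps per task
theta = 0.0      # the meta-learned initialization

for _ in range(200):
    update = 0.0
    for c in task_optima:
        theta_i = theta
        for _ in range(inner_steps):  # k steps of plain SGD on the task
            theta_i -= alpha * 2.0 * (theta_i - c)
        update += theta_i - theta
    # Reptile meta-update: interpolate toward the adapted parameters;
    # no derivatives of the inner loop are required.
    theta += epsilon * update / len(task_optima)

# As with MAML on this toy problem, theta approaches the mean task optimum.
print(theta)
```

The design difference is what makes Reptile cheap: the meta-update `theta + epsilon * (theta_i - theta)` only needs the adapted parameters themselves, never the Jacobian of the inner optimization.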

Bibliography

  1. Kirsch, Louis, and Jürgen Schmidhuber. 2021. "Meta Learning Backpropagation and Improving It". arXiv:2012.14905 [cs, stat]. http://arxiv.org/abs/2012.14905.
  2. Tancik, Matthew, et al. 2021. "Learned Initializations for Optimizing Coordinate-Based Neural Representations". arXiv:2012.02189 [cs]. http://arxiv.org/abs/2012.02189.
  3. Finn, Chelsea, Pieter Abbeel, and Sergey Levine. 2017. "Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks". In Proceedings of the 34th International Conference on Machine Learning, 70:10. Sydney, Australia.
  4. Nichol, Alex, Joshua Achiam, and John Schulman. 2018. "On First-Order Meta-Learning Algorithms". arXiv:1803.02999 [cs]. http://arxiv.org/abs/1803.02999.