- tags
- Machine learning

## Constrained meta-learning

## Meta-learning of initialization

The goal is to learn an initialization of neural network parameters (or the initial state of a recurrent neural network) so that subsequent training is faster and less prone to getting stuck in poor local minima.

For an application to implicit neural representations, see (Tancik et al. 2021).

## Meta-learning algorithms

### MAML
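MAML (Finn et al. 2017) learns the initialization by adapting to each sampled task with a few gradient steps and then updating the initialization with the gradient of the post-adaptation loss. A minimal sketch in plain NumPy, using the first-order approximation (the second-order term is dropped, as in FOMAML) and a hypothetical toy task family of 1-D linear regressions `y = a * x` with hand-derived gradients:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_task():
    # hypothetical task family: 1-D linear regression y = a * x,
    # tasks differ only in the slope a
    a = rng.uniform(0.5, 2.0)
    def data(n):
        x = rng.uniform(-1.0, 1.0, size=n)
        return x, a * x
    return data

def loss(w, x, y):
    return np.mean((w * x - y) ** 2)

def grad(w, x, y):
    # d/dw of the mean squared error for the model y_hat = w * x
    return np.mean(2.0 * (w * x - y) * x)

inner_lr, outer_lr, inner_steps = 0.2, 0.05, 1
w = 5.0  # deliberately poor starting point for the meta-parameters

for step in range(500):
    data = sample_task()
    xs, ys = data(10)   # support set: used for inner-loop adaptation
    xq, yq = data(10)   # query set: used for the meta-update
    w_task = w
    for _ in range(inner_steps):
        w_task = w_task - inner_lr * grad(w_task, xs, ys)
    # first-order approximation: treat w_task as independent of w,
    # so the meta-gradient is just the query-set gradient at w_task
    w = w - outer_lr * grad(w_task, xq, yq)
```

After meta-training, `w` sits near the centre of the task distribution, so a single inner step on a new task already reduces its loss. Full MAML would instead backpropagate through the inner update, which requires second derivatives.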

### Reptile
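Reptile (Nichol et al. 2018) avoids differentiating through the inner loop entirely: run a few SGD steps on a sampled task, then move the initialization a fraction of the way toward the adapted weights. A sketch on the same hypothetical 1-D regression family as above:

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_task():
    # hypothetical task family: 1-D linear regression y = a * x
    a = rng.uniform(0.5, 2.0)
    def data(n):
        x = rng.uniform(-1.0, 1.0, size=n)
        return x, a * x
    return data

def grad(w, x, y):
    # d/dw of the mean squared error for the model y_hat = w * x
    return np.mean(2.0 * (w * x - y) * x)

inner_lr, meta_lr, inner_steps = 0.2, 0.5, 5
w = 5.0  # deliberately poor starting point for the meta-parameters

for step in range(300):
    data = sample_task()
    w_task = w
    for _ in range(inner_steps):
        x, y = data(10)
        w_task = w_task - inner_lr * grad(w_task, x, y)
    # Reptile update: no meta-gradient, just interpolate the
    # initialization toward the task-adapted weights
    w = w + meta_lr * (w_task - w)
```

The update `w + meta_lr * (w_task - w)` needs only first-order information, which is why Reptile is often described as a cheap first-order alternative to MAML.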

## Bibliography

- Louis Kirsch, Jürgen Schmidhuber. 2020. "Meta Learning Backpropagation and Improving It". *arXiv:2012.14905 [cs, stat]*. http://arxiv.org/abs/2012.14905.
- Matthew Tancik, Ben Mildenhall, Terrance Wang, Divi Schmidt, Pratul P. Srinivasan, Jonathan T. Barron, Ren Ng. 2021. "Learned Initializations for Optimizing Coordinate-based Neural Representations". *arXiv:2012.02189 [cs]*. http://arxiv.org/abs/2012.02189. See notes.
- Chelsea Finn, Pieter Abbeel, Sergey Levine. 2017. "Model-agnostic Meta-learning for Fast Adaptation of Deep Networks". In *Proceedings of the 34th International Conference on Machine Learning*, 70:10. Sydney, Australia.
- Alex Nichol, Joshua Achiam, John Schulman. 2018. "On First-order Meta-learning Algorithms". *arXiv:1803.02999 [cs]*. http://arxiv.org/abs/1803.02999.