Optimization
Links to this note
- Automatic differentiation
- Cellular automata as CNNs
- Genetic algorithms
- Gradient descent
- Gradient descent for wide two-layer neural networks – I: Global convergence
- Gradient flow
- Neural network training
- Notes on: AI-GAs: AI-generating algorithms, an alternate paradigm for producing general artificial intelligence by Clune, J. (2019)
- Notes on: POET: open-ended coevolution of environments and their optimized solutions by Wang, R., Lehman, J., Clune, J., & Stanley, K. O. (2019)
- Notes on: The geometry of integration in text classification RNNs by Aitken, K., Ramasesh, V. V., Garg, A., Cao, Y., Sussillo, D., & Maheswaranathan, N. (2020)
- Program synthesis
- Supervised learning
- Talk: Differentiation of black-box combinatorial solvers
- Talk: The Importance of Open-Endedness in AI and Machine Learning
- The Elegance of Optimal Transport