Recurrent neural networks
Links to this note
- Adaptive Computation Time
- Attractor networks
- Backward RNN
- Cellular automata as recurrent neural networks
- Echo-state networks
- Hopfield Networks
- Language modeling
- Meta-learning
- Neural architecture search
- Neural networks as dynamical systems
- Notes on: Adapting to Unseen Environments through Explicit Representation of Context by Tutum, C., & Miikkulainen, R. (2020)
- Notes on: Legendre Memory Units: Continuous-Time Representation in Recurrent Neural Networks by Voelker, A., Kajić, I., & Eliasmith, C. (2019)
- Notes on: Modeling systems with internal state using evolino by Wierstra, D., Gomez, F. J., & Schmidhuber, J. (2005)
- Notes on: Neural Architecture Search with Reinforcement Learning by Zoph, B., & Le, Q. V. (2017)
- Notes on: Neural Circuit Policies Enabling Auditable Autonomy by Lechner, M., Hasani, R., Amini, A., Henzinger, T. A., Rus, D., & Grosu, R. (2020)
- Notes on: Next Generation Reservoir Computing by Gauthier, D. J., Bollt, E., Griffith, A., & Barbosa, W. A. S. (2021)
- Notes on: Pretrained Transformers as Universal Computation Engines by Lu, K., Grover, A., Abbeel, P., & Mordatch, I. (2021)
- Notes on: The geometry of integration in text classification RNNs by Aitken, K., Ramasesh, V. V., Garg, A., Cao, Y., Sussillo, D., & Maheswaranathan, N. (2020)
- Notes on: Transformers are RNNs: Fast Autoregressive Transformers with Linear Attention by Katharopoulos, A., Vyas, A., Pappas, N., & Fleuret, F. (2020)
- Reservoir computing
- Schmidhuber on Consciousness
- Word vectors