- tags
- Recurrent neural networks
- source
- (Voelker, Kajić, and Eliasmith 2019)
Summary
This paper introduces the Legendre Memory Unit (LMU), a recurrent cell. Like the LSTM, it maintains a memory hidden state, but the key idea is to make that memory evolve according to a linear first-order ordinary differential equation:
\[
\theta \dot{m}(t) = A m(t) + B u(t)
\]
With the right choice of \(A\) and \(B\), the state \(m(t)\) represents a sliding window of length \(\theta\) over the input \(u\), expressed in a basis of Legendre polynomials. The new unit is evaluated on a range of tasks: a pure memory (delay) task, permuted sequential MNIST, and prediction of a chaotic dynamical system.
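To make the dynamics concrete, below is a minimal NumPy sketch of the memory update. The entries of \(A\) and \(B\) follow the paper's Legendre/delay derivation; the forward-Euler roll-out and the function names (`lmu_matrices`, `lmu_memory`) are my own simplification, not the authors' code, and the paper discretizes the ODE more carefully (e.g. with a zero-order hold).

```python
import numpy as np

def lmu_matrices(d):
    """State-space matrices from the paper's Legendre/delay derivation:
    a_ij = (2i+1) * (-1 if i < j else (-1)^(i-j+1)),  b_i = (2i+1)(-1)^i."""
    A = np.empty((d, d))
    for i in range(d):
        for j in range(d):
            A[i, j] = (2 * i + 1) * (-1.0 if i < j else (-1.0) ** (i - j + 1))
    k = np.arange(d)
    B = ((2 * k + 1) * (-1.0) ** k).reshape(-1, 1)
    return A, B

def lmu_memory(u, d=8, theta=1.0, dt=0.001):
    """Roll the memory ODE theta * m'(t) = A m(t) + B u(t) over a 1-D
    signal u with a plain forward-Euler step (a crude stand-in for the
    paper's discretization, used only to keep the sketch short)."""
    A, B = lmu_matrices(d)
    m = np.zeros((d, 1))
    for u_t in u:
        m = m + (dt / theta) * (A @ m + B * float(u_t))
    return m.ravel()

# A constant input held for many theta-lengths fills the window with a
# constant, so the memory should concentrate in the 0th Legendre
# coefficient (m[0] near 1, higher coefficients near 0).
m = lmu_memory(np.ones(10_000), d=8, theta=1.0, dt=0.001)
```

One can check algebraically that \(m = e_0\) is the fixed point here: the first column of \(A\) is exactly \(-B\), so \(A e_0 + B = 0\), matching the intuition that a constant window needs only the constant Legendre term.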
Comments
Unfortunately, my understanding of this paper is somewhat limited. The approach is interesting and has appealing theoretical properties; however, the new RNN cell is tested on a small set of tasks, each of which demonstrates a useful property in isolation, but none of which demonstrates them together (e.g. good MNIST prediction combined with long-term dependencies).
Bibliography
- Voelker, Aaron, Ivana Kajić, and Chris Eliasmith. 2019. "Legendre Memory Units: Continuous-time Representation in Recurrent Neural Networks". In Advances in Neural Information Processing Systems 32, edited by H. Wallach, H. Larochelle, A. Beygelzimer, F. d'Alché-Buc, E. Fox, and R. Garnett, 15544–53. Curran Associates, Inc. http://papers.nips.cc/paper/9689-legendre-memory-units-continuous-time-representation-in-recurrent-neural-networks.pdf.