Legendre Memory Units: Continuous-Time Representation in Recurrent Neural Networks by Voelker, A., Kajić, I., & Eliasmith, C. (2019)

Recurrent neural networks
(Voelker et al., 2019)


This paper introduces the Legendre Memory Unit (LMU), a recurrent cell that, like the LSTM, maintains a memory state. The key idea is to have this memory \(m(t)\) evolve according to a linear system of first-order ordinary differential equations: \begin{equation} \theta \dot{m}(t) = Am(t) + Bu(t) \end{equation} With an appropriate choice of \(A\) and \(B\), the solution of this system represents a sliding window of length \(\theta\) of the input \(u\) in the basis of Legendre polynomials. The new unit is evaluated on three tasks: a pure memory (delay) task, permuted sequential MNIST, and prediction of a chaotic dynamical system (Mackey-Glass).
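As a sketch of the memory mechanism, the matrices \(A\) and \(B\) have a closed form given in the paper (the function and step names below are my own, and the Euler update is a simplification; the paper uses a more careful discretization):

```python
import numpy as np

def lmu_matrices(d):
    """Closed-form (A, B) for a d-dimensional LMU memory, following
    Voelker et al. (2019). The LTI system theta * m'(t) = A m(t) + B u(t)
    then projects a sliding window of u onto d shifted Legendre polynomials."""
    A = np.empty((d, d))
    for i in range(d):
        for j in range(d):
            # (2i+1) * (-1 if i < j else (-1)^(i-j+1))
            A[i, j] = (2 * i + 1) * (-1.0 if i < j else (-1.0) ** (i - j + 1))
    q = np.arange(d)
    B = ((2 * q + 1) * (-1.0) ** q).reshape(d, 1)
    return A, B

def lmu_step(m, u, A, B, theta, dt):
    """One forward-Euler step of the memory ODE (illustrative only)."""
    return m + (dt / theta) * (A @ m + B * u)
```

For example, with `d = 4` the memory `m` is a 4-vector updated once per input sample `u`; decoding the window back out amounts to evaluating the Legendre polynomials at the desired delays.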


Unfortunately, my understanding of this paper is somewhat limited. The approach is interesting and has appealing properties, but the new RNN cell is tested on a small set of tasks, each of which demonstrates a useful property in isolation rather than all of them together (e.g. strong MNIST accuracy combined with long-term dependency handling).


Voelker, A., Kajić, I., & Eliasmith, C. (2019). Legendre Memory Units: Continuous-Time Representation in Recurrent Neural Networks. In H. Wallach, H. Larochelle, A. Beygelzimer, F. d'Alché-Buc, E. Fox, & R. Garnett (Eds.), Advances in Neural Information Processing Systems 32 (pp. 15544–15553). Curran Associates, Inc.
