# Legendre Memory Units: Continuous-Time Representation in Recurrent Neural Networks by Voelker, A., Kajić, I., & Eliasmith, C. (2019)

tags
Recurrent neural networks
source
(Voelker, Kajić, and Eliasmith 2019)

## Summary

This paper introduces the Legendre Memory Unit (LMU), a recurrent cell. Like the LSTM, it maintains a memory state, but the key idea is to constrain this memory to satisfy a set of first-order linear ordinary differential equations:

$$\theta \dot{m}(t) = Am(t) + Bu(t)$$

For the particular $$A$$ and $$B$$ derived in the paper, the state $$m(t)$$ optimally represents a sliding window of length $$\theta$$ of the input $$u$$, reconstructed via Legendre polynomials. The new unit is evaluated on three tasks: a memory-capacity task, permuted sequential MNIST, and prediction of a chaotic dynamical system (Mackey-Glass).
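The paper gives closed-form entries for $$A$$ and $$B$$ (for the $$\theta$$-scaled system above): $$a_{ij} = (2i+1)\,(-1 \text{ if } i<j, \text{ else } (-1)^{i-j+1})$$ and $$b_i = (2i+1)(-1)^i$$. A minimal sketch of constructing these matrices and advancing the memory with a simple forward-Euler step (the discretization here is chosen for illustration; the paper uses a more accurate zero-order-hold discretization):

```python
import numpy as np

def lmu_matrices(d):
    """State-space matrices A (d x d) and B (d x 1) of the LMU memory,
    following the closed form given in Voelker et al. (2019)."""
    A = np.empty((d, d))
    for i in range(d):
        for j in range(d):
            # a_ij = (2i+1) * (-1 if i < j else (-1)^(i-j+1))
            A[i, j] = (2 * i + 1) * (-1.0 if i < j else (-1.0) ** (i - j + 1))
    q = np.arange(d)
    # b_i = (2i+1) * (-1)^i
    B = ((2 * q + 1) * (-1.0) ** q).reshape(d, 1)
    return A, B

def euler_step(m, u, A, B, theta, dt):
    """One forward-Euler step of theta * m'(t) = A m(t) + B u(t)."""
    return m + (dt / theta) * (A @ m + B * u)
```

With $$d$$ Legendre coefficients, the memory compresses the most recent window of length $$\theta$$ into a $$d$$-dimensional state, independent of how many time steps the window spans.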