Compression with Neural Networks

Compression can be performed with a neural network serving as an estimator of the probability of a sequence's next character, with those estimates driving the coder (Schmidhuber and Heil 1996).
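As a minimal sketch of this predictive-coding idea, the snippet below uses a simple count-based adaptive character model as a stand-in for the neural estimator (the model choice is an illustration, not the paper's architecture). An arithmetic coder driven by the model's probabilities would spend about -log2 P(c) bits per character, so summing that quantity gives the ideal compressed size: the better the predictions, the fewer the bits.

```python
import math
from collections import defaultdict

def ideal_code_length(text, order=2):
    """Ideal compressed size of `text` in bits under an adaptive
    order-`order` character model with add-one smoothing.

    For each character, the model predicts P(next char | previous
    `order` chars) from counts seen so far; an arithmetic coder fed
    these probabilities would emit about -log2 P(c) bits per character.
    A neural next-character model could supply the probabilities instead.
    """
    alphabet = sorted(set(text))
    counts = defaultdict(lambda: defaultdict(int))
    bits = 0.0
    for i, c in enumerate(text):
        ctx = text[max(0, i - order):i]
        total = sum(counts[ctx].values()) + len(alphabet)  # add-one smoothing
        p = (counts[ctx][c] + 1) / total
        bits += -math.log2(p)
        counts[ctx][c] += 1  # update the model after coding, as the decoder would
    return bits

# A highly regular string becomes very cheap once the model has learned it.
repetitive = "abcabcabc" * 50
print(ideal_code_length(repetitive) / len(repetitive), "bits/char")
```

The decoder can maintain the identical model and update it symbol by symbol, which is why the probabilities never need to be transmitted.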

Compression as a measure of Artificial Intelligence

Mahoney argues in (Mahoney 1999) that being able to compress information amounts to being able to predict optimally the distribution of an input natural-language corpus. A good compression algorithm “learns” features of the language in order to make better predictions. This is perhaps one of the original motivations for language modeling, and it can serve as a test of artificial intelligence.

Compression as a measure of Complexity

Compression algorithms have long been used as a way of measuring the complexity of data: the compressed size of a string is a practical, computable upper bound (up to a constant) on its Kolmogorov complexity.
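As an illustration of compression as a complexity proxy, the sketch below uses an off-the-shelf compressor (zlib, chosen here for convenience; any compressor would do) and compares the compressed sizes of regular versus pseudo-random data of the same length:

```python
import random
import zlib

def compressed_size(data: bytes) -> int:
    """Length in bytes of the zlib-compressed data: a crude,
    machine-computable proxy for the complexity of `data`."""
    return len(zlib.compress(data, level=9))

random.seed(0)
structured = b"0123456789" * 300                           # highly regular, 3000 bytes
noisy = bytes(random.getrandbits(8) for _ in range(3000))  # pseudo-random, 3000 bytes

# The regular string compresses dramatically; the noisy one barely at all.
print(compressed_size(structured), compressed_size(noisy))
```

The regular string shrinks to a small fraction of its size while the pseudo-random one stays near 3000 bytes, which matches the intuition that it has little exploitable structure.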


Mahoney, Matthew V. 1999. “Text Compression as a Test for Artificial Intelligence.” In Proceedings of AAAI-1999, 3.

Schmidhuber, J., and S. Heil. 1996. “Sequential Neural Text Compression.” IEEE Transactions on Neural Networks 7 (1):142–46.
