- tags
- Complexity
To study the complexity of systems, researchers have proposed many metrics, based on principles such as information theory or algorithmic information theory (AIT). Many of these metrics are surveyed in (Grassberger 1989).
Shannon entropy and Kolmogorov Complexity
The paper (Grunwald, Vitányi 2004) gives a thorough description and analysis of two of the most important complexity metrics:
- Information-theoretic metrics (Shannon entropy)
- AIT-based metrics (Kolmogorov complexity)
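To make the information-theoretic side concrete, here is a minimal sketch of the empirical Shannon entropy of a symbol sequence (the function name `shannon_entropy` and the code are my own illustration, not taken from the paper):

```python
from collections import Counter
from math import log2

def shannon_entropy(seq):
    """Empirical Shannon entropy of a sequence, in bits per symbol."""
    counts = Counter(seq)
    n = len(seq)
    # H = -sum_x p(x) * log2 p(x), with p(x) estimated by relative frequency
    return sum(-(c / n) * log2(c / n) for c in counts.values())

# A fair-coin-like sequence carries one bit per symbol,
# while a constant sequence carries zero.
print(shannon_entropy("01010101"))  # 1.0
print(shannon_entropy("00000000"))  # 0.0
```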
For a universal computer \(U\), the algorithmic information of a string \(S\) relative to \(U\) is defined as the length of the shortest program that yields \(S\) when run on \(U\): \[ C_U(S) = \min_{\text{Prog}_U(S)} \text{Len}[\text{Prog}_U(S)] \]
This quantity is called algorithmic information, or Solomonoff–Kolmogorov–Chaitin complexity.
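Although \(C_U(S)\) itself is uncomputable, any off-the-shelf compressor gives a computable upper bound on it (up to an additive constant), since the compressed output together with a fixed decompressor is a program that yields \(S\). A hedged sketch using Python's standard `zlib` module (my choice of compressor for illustration, not one prescribed by the definition):

```python
import os
import zlib

def compressed_length(data: bytes, level: int = 9) -> int:
    """Length in bytes of the zlib-compressed form of `data`.

    Upper-bounds the algorithmic information up to an additive constant.
    """
    return len(zlib.compress(data, level))

# A highly regular string compresses far better than random bytes of the
# same length, reflecting its much lower algorithmic information.
regular = b"ab" * 1000
random_bytes = os.urandom(2000)
print(compressed_length(regular))       # small (tens of bytes)
print(compressed_length(random_bytes))  # close to 2000
```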
Compression-based metrics
For a given sequence, it is not possible to reliably estimate its algorithmic information: the quantity is uncomputable, so we can never know whether we have found the shortest description. A solution is to restrict the encoding method, and the Lempel–Ziv complexity (Lempel, Ziv 1976) is one such restricted metric.
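A minimal sketch of the Lempel–Ziv (1976) complexity, counting the number of phrases in the exhaustive parsing of a sequence (a straightforward quadratic implementation of my own, not the paper's algorithm):

```python
def lz76_complexity(s: str) -> int:
    """Number of phrases in the Lempel–Ziv (1976) parsing of s.

    Each phrase is grown while it can still be copied from the part of
    the sequence seen so far (overlapping copies allowed), plus one
    innovative symbol that ends the phrase.
    """
    n = len(s)
    phrases = 0
    i = 0
    while i < n:
        l = 1
        # grow the candidate phrase while s[i:i+l] still occurs
        # somewhere in the prefix ending just before its last symbol
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        phrases += 1
        i += l
    return phrases

# Classic example from Lempel and Ziv (1976):
# 0 | 001 | 10 | 100 | 1000 | 101  ->  6 phrases
print(lz76_complexity("0001101001000101"))  # 6
```

A periodic sequence such as `"0" * 16` parses into only 2 phrases, while a random sequence of the same length yields many more, which is what makes the phrase count usable as a complexity metric.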
Others
- Epsilon machines
- Statistical complexity
- In (Cisneros et al. 2019), we propose a complexity metric based on the compression quality of a local, neural-network-based prediction model.
Bibliography
- Peter Grassberger. 1989. "Randomness, Information, and Complexity". In Proceedings of the 5th Mexican School on Statistical Physics. http://arxiv.org/abs/1208.3459.
- Peter Grunwald, Paul Vitányi. 2004. "Shannon Information and Kolmogorov Complexity". arXiv preprint cs/0410002.
- A. Lempel, J. Ziv. 1976. "On the Complexity of Finite Sequences". IEEE Transactions on Information Theory 22 (1): 75–81. DOI.
- Hugo Cisneros, Josef Sivic, Tomas Mikolov. 2019. "Evolving Structures in Complex Systems". In 2019 IEEE Symposium Series on Computational Intelligence (SSCI), 230–37. Xiamen, China: IEEE. DOI.