- tags
- Complexity
To quantify the complexity of systems, researchers have proposed a range of metrics, grounded in principles such as information theory and algorithmic information theory. Many of these metrics are surveyed in (Grassberger 1989).
Shannon entropy and Kolmogorov Complexity
The paper (Grunwald and Vitanyi, n.d.) gives a thorough description and analysis of two of the most important families of complexity metrics:
Information-theoretic metrics
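A concrete instance of an information-theoretic metric is the empirical Shannon entropy of a sequence, estimated from symbol frequencies. A minimal sketch (the function name is illustrative, not from the paper):

```python
import math
from collections import Counter

def empirical_entropy(seq):
    """Shannon entropy, in bits per symbol, of the empirical
    symbol distribution of `seq`."""
    counts = Counter(seq)
    n = len(seq)
    # H = -sum over symbols of p * log2(p), with p the relative frequency
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A maximally mixed binary sequence has 1 bit/symbol;
# a constant sequence has 0 bits/symbol.
print(empirical_entropy("0101"))  # → 1.0
print(empirical_entropy("0000"))  # → 0.0 (printed as -0.0 or 0.0)
```

Note that this estimates entropy from single-symbol statistics only; it ignores correlations between symbols, which is precisely why the compression-based metrics below are of interest.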
AIT-based metrics
For a universal computer \(U\), the algorithmic information of a string \(S\) relative to \(U\) is defined as the length of the shortest program that outputs \(S\) on \(U\): \[ C_U(S) = \min_{Prog_U(S)} \text{Len}[Prog_U(S)] \]
This quantity is called algorithmic information, or Solomonoff–Kolmogorov–Chaitin complexity.
Compression-based metrics
The algorithmic information of a given sequence cannot be estimated reliably, because we can never know whether we have found the shortest description. A practical workaround is to restrict the class of encoders; Lempel–Ziv complexity (Lempel and Ziv 1976) is one such restriction.
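Lempel–Ziv complexity counts the phrases in the exhaustive-history parsing of a sequence: each phrase is extended as long as it can be copied from the history (overlap allowed), then closed with one new symbol. A minimal, direct sketch of this definition:

```python
def lz_complexity(s):
    """Number of phrases in the Lempel-Ziv (1976) exhaustive-history
    parsing of the string s."""
    n = len(s)
    c = 0  # phrase count
    i = 0  # start of the current phrase
    while i < n:
        k = 1
        # Extend the phrase while its prefix s[i:i+k] can be copied from
        # a position starting before i (occurrence inside s[:i+k-1]
        # guarantees the copy starts at some p < i, overlap allowed).
        while i + k <= n and s[i:i + k] in s[:i + k - 1]:
            k += 1
        c += 1   # close the phrase with one new symbol (or end of string)
        i += k
    return c

# Classic example from Kaspar & Schuster's analysis of this measure:
# "0001101001000101" parses as 0 | 001 | 10 | 100 | 1000 | 101
print(lz_complexity("0001101001000101"))  # → 6
print(lz_complexity("0000000000"))        # → 2 (constant sequences are minimal)
```

The substring test makes this sketch quadratic or worse; production implementations use the linear-time scanning scheme of Kaspar and Schuster instead, but the phrase counts are identical.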
Others
- Epsilon machines
- Statistical complexity
- In (Cisneros, Sivic, and Mikolov 2019), we propose a complexity metric based on the compression quality of a local, neural-network-based prediction model.
Bibliography
- Cisneros, Hugo, Josef Sivic, and Tomas Mikolov. December 2019. “Evolving Structures in Complex Systems”. In 2019 IEEE Symposium Series on Computational Intelligence (SSCI), 230–37. Xiamen, China: IEEE.
- Grassberger, Peter. 1989. “Randomness, Information, and Complexity”. In Proceedings of the 5th Mexican School on Statistical Physics.
- Grunwald, Peter, and Paul Vitanyi. n.d. “Shannon Information and Kolmogorov Complexity”, 51.
- Lempel, A., and J. Ziv. January 1976. “On the Complexity of Finite Sequences”. IEEE Transactions on Information Theory 22 (1): 75–81.