Complexity metrics

Tags: Complexity

To study the complexity of various systems, researchers have proposed a number of metrics, grounded in principles from information theory and algorithmic information theory. Many of these metrics are surveyed in (Grassberger 1989).

Shannon entropy and Kolmogorov complexity

The paper (Grunwald and Vitanyi, n.d.) gives a thorough description and analysis of two of the most important families of complexity metrics:

Information-theoretic metrics
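As a concrete illustration of the information-theoretic side (a minimal sketch, not taken from the paper; the function name and test strings are my own), the Shannon entropy of a sequence can be estimated from its empirical symbol frequencies via \( H = -\sum_i p_i \log_2 p_i \):

```python
from collections import Counter
from math import log2

def empirical_entropy(sequence) -> float:
    """Shannon entropy (bits/symbol) of the empirical symbol distribution."""
    counts = Counter(sequence)
    n = len(sequence)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Two sequences of the same length with different symbol statistics:
print(empirical_entropy("01" * 500))    # 1.0 bit/symbol (two equiprobable symbols)
print(empirical_entropy("0001" * 250))  # ~0.811 bits/symbol (biased toward '0')
```

Note that this per-symbol estimate ignores correlations: the perfectly predictable sequence "01" repeated still scores a full bit per symbol, which is one motivation for the algorithmic metrics below.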

AIT-based metrics

For a universal computer \(U\), the algorithmic information of a sequence \(S\) relative to \(U\) is defined as the length of the shortest program that yields \(S\) when run on \(U\): \[ C_U(S) = \min_{p \,:\, U(p) = S} \text{Len}(p) \]

This quantity is called algorithmic information, or Solomonoff–Kolmogorov–Chaitin complexity.
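Since the minimum ranges over all programs, any explicit program that outputs \(S\) gives an upper bound on \(C_U(S)\). An illustrative sketch (names are my own), taking \(U\) to be a Python interpreter:

```python
# Any explicit program that prints S upper-bounds C_U(S).
n = 10**6
s = "01" * n                      # the sequence S: 2,000,000 characters
program = f"print('01' * {n})"    # a complete Python program printing S

print(len(s))        # 2000000
print(len(program))  # 21 -- so C_U(S) <= 21 + O(1), growing only like log10(n)
```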

Compression-based metrics

For a given sequence, it is not possible to reliably estimate its algorithmic information, because we can never know whether we have found the shortest description. A practical solution is to restrict the encoding method; the Lempel–Ziv complexity is one such compression-based measure (Lempel and Ziv 1976).
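A minimal sketch of this Lempel–Ziv complexity, counting the components of the exhaustive-history parsing from the 1976 paper (the implementation details and test strings are my own):

```python
def lz76_complexity(s: str) -> int:
    """Number of components in the LZ76 exhaustive-history parsing of s.

    Each new component is the shortest prefix of the unparsed remainder
    that cannot be copied (overlap allowed) from the part already parsed.
    """
    n = len(s)
    if n <= 1:
        return n
    c, l = 1, 1            # component count; start of current component (1-indexed)
    i, k, k_max = 0, 1, 1  # copy source; current match length; longest match so far
    while True:
        if s[i + k - 1] == s[l + k - 1]:
            k += 1                   # current copy keeps matching; extend it
            if l + k > n:            # matched to the end of the sequence:
                c += 1               # the final component is reproducible
                break
        else:
            k_max = max(k_max, k)    # remember the longest copy found so far
            i += 1                   # try the next starting point in the prefix
            if i == l:               # no starting point reproduces the prefix:
                c += 1               # close the component, start a new one
                l += k_max
                if l + 1 > n:
                    break
                i, k, k_max = 0, 1, 1
            else:
                k = 1
    return c

print(lz76_complexity("01" * 50))            # periodic: only 3 components
print(lz76_complexity("0110100110010110"))   # aperiodic: more components
```

Periodic sequences parse into very few components, while sequences without exploitable regularities force a new component roughly every \( \log n \) symbols, so the count grows like \( n / \log n \) for incompressible input.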

Others

Bibliography

Grassberger, Peter. 1989. “Randomness, Information, and Complexity.” In Proceedings of the 5th Mexican School on Statistical Physics.

Grunwald, Peter, and Paul Vitanyi. n.d. “Shannon Information and Kolmogorov Complexity.”

Lempel, A., and J. Ziv. 1976. “On the Complexity of Finite Sequences.” IEEE Transactions on Information Theory 22 (1): 75–81.

