Information erasure lurking behind measures of complexity
Saved in:

Published in: arXiv.org, 2011-10
Main authors: , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Summary: Complex systems are found in most branches of science. It is still debated how best to quantify their complexity and to what end. One prominent measure of complexity (the statistical complexity) has an operational meaning in terms of the amount of resources needed to forecast a system's behaviour. Another (the effective measure complexity, also known as excess entropy) measures the mutual information stored in the system proper. We show that for any given system the two measures differ by the amount of information erased during forecasting. We interpret the difference as the inefficiency of a given model. We find a bound on the ratio of the two measures, defined as the information-processing efficiency, in analogy to the second law of thermodynamics. This new link between two prominent measures of complexity provides a quantitative criterion for good models of complex systems, namely those with little information erasure.
ISSN: 2331-8422
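
To illustrate the two quantities compared in the summary, the sketch below numerically computes the statistical complexity C_mu (the entropy of the causal-state distribution of an epsilon-machine) and the excess entropy E (estimated from block entropies) for a simple stationary process, then reports their difference and ratio. The choice of process (the golden mean process), the labelled-transition-matrix representation, and the block-entropy estimator are illustrative assumptions, not the construction used in the paper.

```python
import numpy as np
from itertools import product

# Illustrative epsilon-machine for the golden mean process (no two 0s in a row).
# T[x][i, j] = P(emit symbol x and move to causal state j | current state i).
T = {
    0: np.array([[0.0, 0.5],
                 [0.0, 0.0]]),
    1: np.array([[0.5, 0.0],
                 [1.0, 0.0]]),
}

def entropy(p):
    """Shannon entropy in bits of a probability vector (zeros are ignored)."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Stationary distribution over causal states: left eigenvector of T0 + T1
# for the largest (unit) eigenvalue.
evals, evecs = np.linalg.eig((T[0] + T[1]).T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()

# Statistical complexity: entropy of the causal-state distribution.
C_mu = entropy(pi)

def block_entropy(L):
    """Exact entropy of length-L words generated by the machine."""
    probs = []
    for word in product(T, repeat=L):
        v = pi
        for x in word:
            v = v @ T[x]
        probs.append(v.sum())
    return entropy(np.array(probs))

# Excess entropy via the block-entropy asymptote E = lim_L [H(L) - L * h_mu].
L = 12
h_mu = block_entropy(L) - block_entropy(L - 1)   # entropy-rate estimate
E = block_entropy(L) - L * h_mu

print(f"statistical complexity C_mu     = {C_mu:.4f} bits")
print(f"excess entropy         E        = {E:.4f} bits")
print(f"difference             C_mu - E = {C_mu - E:.4f} bits")
print(f"efficiency             E / C_mu = {E / C_mu:.4f}")
```

For this particular process the script should report C_mu of roughly 0.918 bits and E of roughly 0.252 bits, so the efficiency E / C_mu is well below one; the gap is the part of the stored model information that is not predictive, which the paper interprets as information erased during forecasting.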