Debates: Does Information Theory Provide a New Paradigm for Earth Science? Sharper Predictions Using Occam's Digital Razor



Bibliographic Details
Published in: Water Resources Research, 2020-02, Vol. 56 (2), p. n/a
Authors: Weijs, Steven V.; Ruddell, Benjamin L.
Format: Article
Language: English
Description
Abstract: Occam's Razor is a bedrock principle of the philosophy of science, stating that the simplest hypothesis (or model) is preferred at any given level of model predictive performance. A modern restatement, often attributed to Einstein, explains: "Everything should be made as simple as possible, but not simpler." Using principles from (algorithmic) information theory, both model descriptive performance and model complexity can be quantified in bits. This quantification yields a Pareto-style trade-off between model complexity (the length of the model program in bits) and model performance (the information loss in bits, or the missing information needed to describe the original observations). Model complexity and performance can be collapsed into one single measure of lossless model size which, when minimized, leads to the optimal complexity-versus-loss trade-off for generalization and prediction. Our view puts both simple data-driven and complex physical-process-based models on a continuum, in the sense that both describe patterns in observed data in compressed form, with different degrees of generality, model complexity, and descriptive performance. Information theory-based assessment of compression performance, with fair and meaningful accounting for model complexity, will enable us to best compare and combine the strengths of physics knowledge and data-driven modeling for a given problem, given the availability of data.

"Suppose we draw a set of points on paper in a totally random manner" … "I am saying it is possible to find a geometric line whose notation is constant and uniform, following a certain law, that will pass through all points, and in the same order they were drawn." … "But if that law is strongly composed, the thing that conforms to it should be seen as irregular."
Gottfried Wilhelm Leibniz, 1686: Discours de métaphysique V, VI (translated from French)

Key Points:
- Information theory provides a powerful framework to measure and optimize model complexity versus performance (loss) in the same unit: bits
- Modeling observed reality is data compression; its success can be measured by a single objective: efficiency in compression of observations
- Quantification of complexity allows a fairer comparison of performance between physical process models and data-driven statistical models
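The trade-off the abstract describes (model complexity in bits versus missing information in bits, collapsed into one total description length to be minimized) can be illustrated with a small numerical sketch. The Python example below is not from the paper; the synthetic data, the 32-bits-per-parameter model cost, and the idealized Gaussian code for the residuals are all illustrative assumptions. It only shows how ranking candidate models by the two-part sum of model bits plus residual bits penalizes both underfitting and excess complexity.

# A minimal sketch, not from the paper: total description length =
# model bits (complexity) + residual bits (missing information).
# Data, parameter cost, and residual coding scheme are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
y = 2.0 * x + 0.1 * rng.standard_normal(x.size)   # synthetic observations

def residual_bits(residuals, resolution=1e-3):
    # Approximate bits needed to encode the residuals at a fixed
    # resolution, assuming an idealized Gaussian code.
    sigma = max(float(np.std(residuals)), resolution)
    bits_per_sample = 0.5 * np.log2(2 * np.pi * np.e * sigma**2) - np.log2(resolution)
    return max(bits_per_sample, 0.0) * residuals.size

candidates = {
    # model name: (number of parameters, fitted predictions)
    "constant mean":       (1, np.full_like(y, y.mean())),
    "linear fit":          (2, np.polyval(np.polyfit(x, y, 1), x)),
    "degree-9 polynomial": (10, np.polyval(np.polyfit(x, y, 9), x)),
}

BITS_PER_PARAMETER = 32   # assumed cost of describing one parameter

for name, (n_params, y_hat) in candidates.items():
    model_bits = n_params * BITS_PER_PARAMETER       # model complexity
    loss_bits = residual_bits(y - y_hat)              # information loss
    total = model_bits + loss_bits                    # single objective to minimize
    print(f"{name:22s} total = {total:7.0f} bits "
          f"(model {model_bits:4d}, residual {loss_bits:7.0f})")

Under these assumptions the linear model typically attains the smallest total: the constant model loses too many bits in its residuals, while the degree-9 polynomial pays more in model bits than it recovers in compression. This is the complexity-versus-loss balance the abstract refers to.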
ISSN: 0043-1397, 1944-7973
DOI: 10.1029/2019WR026471