Information theory demonstration of the Richardson cascade
Saved in:
Main authors: | , |
---|---|
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Order full text |
Abstract: | Turbulence theory is usually concerned with the statistical moments of the velocity or its fluctuations. One could also analyze the implicit probability distributions. This is the purview of information theory. Here we use information theory, specifically the conditional entropy, to analyze (quasi-)2D turbulence. We recast Richardson's "eddy hypothesis", that large eddies break up into small eddies in time, in the language of information theory. In addition to confirming Richardson's idea, we find that self-similarity and turbulent length scales reappear naturally. Not surprisingly, we also find that the direction of information transfer is the same as the direction of the cascade itself. Consequently, intermittency may be considered a necessary companion to all turbulent flows. |
DOI: | 10.48550/arxiv.1602.02980 |
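The abstract names the central tool, the conditional entropy, without giving the estimator used in the paper. As a minimal sketch of the kind of quantity involved, the snippet below computes a plug-in histogram estimate of H(Y | X) from two sample arrays. The function name, bin count, and toy Gaussian data are assumptions for illustration only; they are not the paper's actual estimator or flow fields.

```python
import numpy as np

def conditional_entropy(x, y, bins=32):
    """Estimate H(Y | X) in bits from samples, using H(Y | X) = H(X, Y) - H(X).

    x, y : 1D sample arrays (e.g. velocity signals filtered at two scales -- hypothetical inputs).
    bins : number of histogram bins per variable (crude plug-in estimator).
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()          # joint probability p(x, y)
    p_x = p_xy.sum(axis=1)              # marginal p(x)

    nz_xy = p_xy > 0
    h_xy = -np.sum(p_xy[nz_xy] * np.log2(p_xy[nz_xy]))   # joint entropy H(X, Y)
    nz_x = p_x > 0
    h_x = -np.sum(p_x[nz_x] * np.log2(p_x[nz_x]))        # marginal entropy H(X)
    return h_xy - h_x

# Toy check: a variable strongly coupled to x has low H(Y | X);
# an independent variable retains nearly all of its entropy given x.
rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y_coupled = x + 0.1 * rng.normal(size=x.size)
y_indep = rng.normal(size=x.size)
print(conditional_entropy(x, y_coupled))   # small: x largely determines y
print(conditional_entropy(x, y_indep))     # larger: x says little about y
```

In an analysis of the kind the abstract describes, X and Y would be flow quantities at different scales or times, and comparing H(Y | X) in both directions indicates which way information flows along the cascade.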