On the generic increase of observational entropy in isolated systems
Saved in:
| Main author(s): | , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Subjects: | |
| Online access: | Order full text |
Abstract: Observational entropy -- a quantity that unifies Boltzmann's entropy, Gibbs' entropy, von Neumann's macroscopic entropy, and the diagonal entropy -- has recently been argued to play a key role in a modern formulation of statistical mechanics. Here, relying on algebraic techniques taken from Petz's theory of statistical sufficiency and on a Lévy-type concentration bound, we prove rigorous theorems showing how the observational entropy of a system undergoing a unitary evolution chosen at random tends to increase with overwhelming probability and to reach its maximum very quickly. More precisely, we show that for any observation that is sufficiently coarse with respect to the size of the system, regardless of the initial state of the system (be it pure or mixed), random evolution renders its state practically indistinguishable from the uniform (i.e., maximally mixed) distribution with a probability approaching one as the size of the system grows. The same conclusion holds not only for random evolutions sampled according to the unitarily invariant Haar distribution, but also for approximate 2-designs, which are thought to provide a more physically and computationally reasonable model of random evolutions.
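The abstract uses but does not restate the definition of observational entropy. For orientation, the sketch below gives the standard definition from the observational-entropy literature together with the generic shape of a Lévy-type concentration bound; the symbols $P_i$, $p_i$, $V_i$ and the constant $C$ are notation chosen here, not taken from the paper, and the paper's own bound may carry different constants.

```latex
% Observational entropy of a state \rho under a coarse-graining
% {P_i}: orthogonal projectors with \sum_i P_i = I on a d-dimensional space.
p_i = \operatorname{Tr}(P_i \rho), \qquad V_i = \operatorname{Tr}(P_i),
\qquad
S_O(\rho) = \sum_i p_i \ln \frac{V_i}{p_i},
\qquad 0 \le S_O(\rho) \le \ln d .

% Generic Levy-type concentration on the unit sphere of C^d:
% for an L-Lipschitz function f and a Haar-random |\varphi\rangle,
% with some dimension-independent constant C > 0,
\Pr\!\left[\, \bigl| f(\varphi) - \mathbb{E} f \bigr| \ge \epsilon \,\right]
\le 2 \exp\!\left( - \frac{C \, d \, \epsilon^2}{L^2} \right).
```

Read together, these give the rough mechanism behind the "overwhelming probability" claim: the observational entropy of the evolved state is a Lipschitz function of the random pure state, so it concentrates around its average, and the theorems show that this average is close to the maximum $\ln d$ for sufficiently coarse observations.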
DOI: 10.48550/arxiv.2404.11985
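As a self-contained numerical sketch of the abstract's claim (not taken from the paper; the dimension, the two-cell coarse-graining, and all function names below are illustrative choices), the script draws a Haar-random unitary via the phase-corrected QR decomposition of a complex Ginibre matrix and checks that the observational entropy of an evolved pure state lands near its maximum ln d.

```python
import numpy as np

def haar_unitary(d, rng):
    """Sample a Haar-random d x d unitary: QR of a complex Ginibre matrix,
    with the phases of R's diagonal absorbed so the law is exactly Haar."""
    z = (rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    phases = np.diagonal(r) / np.abs(np.diagonal(r))
    return q * phases  # scales column j of q by phases[j]

def observational_entropy(rho, projectors):
    """S_O(rho) = sum_i p_i ln(V_i / p_i), p_i = Tr(P_i rho), V_i = Tr(P_i)."""
    s = 0.0
    for proj in projectors:
        p = float(np.trace(proj @ rho).real)
        if p > 1e-15:  # 0 * ln(...) -> 0 by convention
            s += p * np.log(np.trace(proj).real / p)
    return s

rng = np.random.default_rng(seed=1)
d = 512

# Coarse two-cell observation: "the system sits in basis state 0" vs "anywhere else".
P0 = np.zeros((d, d)); P0[0, 0] = 1.0
P_rest = np.eye(d) - P0
coarse = [P0, P_rest]

rho = P0.copy()  # pure initial state |0><0|; S_O = 0, the minimum
print(f"initial   S_O = {observational_entropy(rho, coarse):.4f}")

U = haar_unitary(d, rng)
rho = U @ rho @ U.conj().T  # one randomly chosen unitary evolution
print(f"evolved   S_O = {observational_entropy(rho, coarse):.4f}")
print(f"maximum ln(d) = {np.log(d):.4f}")
```

On typical runs with d = 512 the initial S_O is exactly 0 while the evolved S_O falls within about 10^-2 of ln 512 ≈ 6.24, and increasing d tightens the concentration, consistent with the abstract's statement that the maximum is reached with probability approaching one as the system grows.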