Monotonicity of the Trace-Inverse of Covariance Submatrices and Two-Sided Prediction

Bibliographic Details
Published in: IEEE Transactions on Information Theory, 2022-04, Vol. 68 (4), pp. 2767-2781
Main Authors: Khina, Anatoly; Yeredor, Arie; Zamir, Ram
Format: Article
Language: English
Description
Summary: It is common to assess the "memory strength" of a stationary process by looking at how fast the normalized log-determinant of its covariance submatrices (i.e., entropy rate) decreases. In this work, we propose an alternative characterization in terms of the normalized trace-inverse of the covariance submatrices. We show that this sequence is monotonically non-decreasing and is constant if and only if the process is white. Furthermore, while the entropy rate is associated with one-sided prediction errors (present from past), the new measure is associated with two-sided prediction errors (present from past and future). Minimizing this measure is then used as an alternative to Burg's maximum-entropy principle for spectral estimation. We also propose a counterpart for non-stationary processes, by looking at the average trace-inverse of subsets.
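The summary's central claim — that the normalized trace-inverse of the covariance submatrices is monotonically non-decreasing, and constant if and only if the process is white — can be checked numerically. The sketch below is not the authors' code; it simply evaluates the sequence for a unit-variance AR(1) process with an assumed illustrative coefficient `rho`, whose covariance submatrices are the Toeplitz matrices K[i, j] = rho^|i-j|.

```python
import numpy as np

rho = 0.6  # illustrative AR(1) coefficient (assumed, not from the paper)

def normalized_trace_inverse(n: int) -> float:
    """Normalized trace of the inverse of the n x n covariance submatrix
    of a unit-variance AR(1) process: K[i, j] = rho^|i-j|."""
    idx = np.arange(n)
    K = rho ** np.abs(idx[:, None] - idx[None, :])
    return np.trace(np.linalg.inv(K)) / n

values = [normalized_trace_inverse(n) for n in range(1, 11)]

# Per the abstract, the sequence should be monotonically non-decreasing;
# for a white process (rho = 0) it would be constant at 1.
assert all(a <= b + 1e-12 for a, b in zip(values, values[1:]))
print(values)
```

For this AR(1) example the sequence starts at 1 (a single sample has unit variance) and increases toward (1 + rho^2)/(1 - rho^2), consistent with its interpretation as an inverse two-sided prediction-error variance: interior samples are predicted from both past and future neighbors.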
ISSN: 0018-9448, 1557-9654
DOI: 10.1109/TIT.2021.3131912