Robust least-squares estimation with a relative entropy constraint

Bibliographic Details
Published in: IEEE Transactions on Information Theory, 2004-01, Vol. 50 (1), p. 89-104
Main authors: Levy, B.C., Nikoukhah, R.
Format: Article
Language: English
Description
Summary: Given a nominal statistical model, we consider the minimax estimation problem of finding the best least-squares estimator for the least favorable statistical model within a neighborhood of the nominal model. The neighborhood is formed by placing a bound on the Kullback-Leibler (KL) divergence between the actual and nominal models. For a Gaussian nominal model and a finite observation interval, or for a stationary Gaussian process over an infinite interval, the usual noncausal Wiener filter remains optimal. However, the worst-case performance of the filter is affected by the size of the neighborhood representing the model uncertainty. On the other hand, standard causal least-squares estimators are not optimal, and a characterization is provided for the causal estimator and the corresponding least favorable model. The causal estimator takes the form of a risk-sensitive estimator with an appropriately selected risk-sensitivity coefficient.
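
As a worked restatement of the minimax formulation described in the summary (the notation below is illustrative and not taken from the paper itself): with nominal model $f_0$, quantity to be estimated $x$, observations $y$, and uncertainty radius $c > 0$, the problem is

$$
\min_{\hat{x}(\cdot)} \;\; \max_{f \,:\, D(f \,\|\, f_0) \le c} \; E_f\!\left[\, \| x - \hat{x}(y) \|^2 \,\right],
\qquad
D(f \,\|\, f_0) = \int f \ln \frac{f}{f_0},
$$

where $D(f \,\|\, f_0)$ is the KL divergence defining the neighborhood of admissible models. The abstract's result is that the outer (noncausal) solution coincides with the nominal Wiener filter, while the causal solution becomes a risk-sensitive estimator.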
ISSN: 0018-9448 (print); 1557-9654 (electronic)
DOI: 10.1109/TIT.2003.821992