Recursive Maximum Likelihood Algorithm for Dependent Observations


Detailed Description

Bibliographic Details
Published in: IEEE Transactions on Signal Processing, March 2019, Vol. 67(5), pp. 1366-1381
Main Authors: Schwartz, Boaz; Gannot, Sharon; Habets, Emanuel A. P.; Noam, Yair
Format: Article
Language: English
Description
Abstract: A recursive maximum-likelihood (RML) algorithm is proposed that can be used when both the observations and the hidden data are continuous-valued and statistically dependent across time samples. The algorithm recursively approximates the probability density functions of the observed and hidden data by analytically computing the integrals with respect to the state variables, while the parameters are updated using gradient steps. A full convergence proof is given, based on the ordinary differential equation approach, showing that the algorithm converges to a local minimum of the Kullback-Leibler divergence between the true and the estimated parametric probability density functions, a result that is useful even for a misspecified parametric model. Compared with other RML algorithms proposed in the literature, this contribution extends the state-space model and provides a theoretical analysis of a nontrivial statistical model that had not previously been analyzed. The RML analysis is further extended to constrained parameter estimation problems. Two examples, including nonlinear state-space models, are given to highlight this contribution.
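As a rough illustration of the "parameters updated using gradient steps" idea in the abstract, the sketch below runs a Robbins-Monro-style recursive maximum-likelihood update on a fully observed AR(1) process, i.e. observations that are statistically dependent across time samples. This is a deliberately simplified setting: it omits the hidden states and the analytic integration over state variables that the paper's algorithm handles, and the model, step-size schedule, and variable names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated dependent observations: AR(1) process y_t = theta * y_{t-1} + w_t,
# with w_t standard Gaussian (illustrative model, not the paper's state-space setup).
theta_true = 0.7
T = 20000
y = np.zeros(T)
for t in range(1, T):
    y[t] = theta_true * y[t - 1] + rng.normal()

# Recursive ML: one gradient step on the incremental log-likelihood per sample.
theta = 0.0
for t in range(1, T):
    gamma = 1.0 / (t + 1)  # decreasing step size, as required by the ODE method
    # d/dtheta of log N(y_t; theta * y_{t-1}, 1):
    grad = (y[t] - theta * y[t - 1]) * y[t - 1]
    theta += gamma * grad

print(theta)  # recursive estimate, close to theta_true for large T
```

Because the incremental log-likelihood here is available in closed form, no approximation of the predictive density is needed; in the paper's hidden-data setting that density is instead approximated recursively before the gradient step is taken.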
ISSN: 1053-587X, 1941-0476
DOI: 10.1109/TSP.2018.2889945