A two-step proximal-point algorithm for the calculus of divergence-based estimators in finite mixture models

Bibliographic Details
Published in: Canadian Journal of Statistics, 2019-04, Vol. 47 (3), p. 392-408
Main authors: Broniatowski, Michel; Al Mohamad, Diaa
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Abstract: Estimators derived from the expectation-maximization (EM) algorithm are not robust since they are based on the maximization of the likelihood function. We propose an iterative proximal-point algorithm based on the EM algorithm which aims to minimize a divergence criterion between a mixture model and some unknown distribution generating the data. In each iteration the algorithm estimates the proportions and the parameters of the mixture components in two separate steps. The resulting estimators are generally robust against outliers and misspecification. Convergence properties of our algorithm are treated. The convergence of the introduced algorithm is discussed on a two-component Weibull mixture, entailing a condition on the initialization of the EM algorithm in order for the latter to converge. Simulations on Gaussian and Weibull mixture models using different statistical divergences are provided to confirm the validity of our work and the robustness of the resulting estimators against outliers in comparison to the EM algorithm.
ISSN: 0319-5724
DOI: 10.1002/cjs.11500
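
The abstract above describes an iteration in which the mixture proportions and the component parameters are updated in two separate steps. The sketch below is only an illustration of that two-step structure, not the authors' divergence-based estimator: it uses ordinary likelihood-based EM updates for a two-component Gaussian mixture and damps each update toward the previous iterate as a simple proximal-style regularization. The function names, the damping weight lam, and all numerical choices are illustrative assumptions.

import numpy as np

def _responsibilities(x, pi, mu, sigma):
    # E-step: posterior probability of each component for every observation
    dens = np.stack(
        [pi[k] * np.exp(-0.5 * ((x - mu[k]) / sigma[k]) ** 2) / (sigma[k] * np.sqrt(2.0 * np.pi))
         for k in range(2)],
        axis=1,
    )
    return dens / dens.sum(axis=1, keepdims=True)

def two_step_em(x, n_iter=200, lam=0.7):
    # Illustrative two-step, proximally damped EM for a 2-component Gaussian mixture.
    # NOT the paper's divergence-based estimator; lam is an assumed damping weight.
    pi = np.array([0.5, 0.5])
    mu = np.quantile(x, [0.25, 0.75])      # crude initial component means
    sigma = np.array([x.std(), x.std()])   # crude initial standard deviations

    for _ in range(n_iter):
        # Step 1: update only the proportions, damped toward the previous iterate
        r = _responsibilities(x, pi, mu, sigma)
        pi = lam * r.mean(axis=0) + (1.0 - lam) * pi

        # Step 2: update the component parameters with the new proportions held fixed
        r = _responsibilities(x, pi, mu, sigma)
        for k in range(2):
            w = r[:, k]
            m = np.sum(w * x) / w.sum()
            s = np.sqrt(np.sum(w * (x - m) ** 2) / w.sum())
            mu[k] = lam * m + (1.0 - lam) * mu[k]
            sigma[k] = lam * s + (1.0 - lam) * sigma[k]

    return pi, mu, sigma

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(0.0, 1.0, 700), rng.normal(4.0, 0.5, 300)])
    print(two_step_em(x))

With lam = 1 the iteration reduces to standard EM; smaller values slow the updates toward the previous iterate, which is one simple way to mimic the stabilizing effect of a proximal term.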