New Derivation for Gaussian Mixture Model Parameter Estimation: MM Based Approach

Bibliographic Details
Main Authors: Sahu, Nitesh; Babu, Prabhu
Format: Article
Language: English
Description
Summary: In this letter, we revisit the problem of maximum likelihood estimation (MLE) of the parameters of the Gaussian Mixture Model (GMM) and present a new derivation of its parameter estimates. Unlike the classical approach based on expectation-maximization (EM), the new derivation is straightforward and does not invoke any hidden or latent variables or the calculation of a conditional density function. It is based on the minorization-maximization approach and involves finding a tighter lower bound on the log-likelihood criterion. The parameter update steps obtained via the new derivation are the same as those obtained via the classical EM algorithm.
DOI:10.48550/arxiv.2001.02923
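
For readers unfamiliar with the minorization-maximization (MM) idea summarized above, the following is a minimal illustrative sketch of how a lower bound on the GMM log-likelihood can be constructed and maximized; the notation (data $x_n$, weights $\pi_k$, means $\mu_k$, covariances $\Sigma_k$, auxiliary weights $w_{nk}$) and the specific Jensen-type bound shown here are assumptions for illustration and need not coincide with the bound derived in the paper itself.

$$
\ell(\theta) \;=\; \sum_{n=1}^{N} \log \sum_{k=1}^{K} \pi_k\, \mathcal{N}\!\left(x_n \mid \mu_k, \Sigma_k\right),
\qquad
w_{nk}^{(t)} \;=\; \frac{\pi_k^{(t)}\, \mathcal{N}\!\left(x_n \mid \mu_k^{(t)}, \Sigma_k^{(t)}\right)}{\sum_{j=1}^{K} \pi_j^{(t)}\, \mathcal{N}\!\left(x_n \mid \mu_j^{(t)}, \Sigma_j^{(t)}\right)}.
$$

By the concavity of the logarithm (Jensen's inequality),

$$
\ell(\theta) \;\ge\; g\!\left(\theta \mid \theta^{(t)}\right) \;=\; \sum_{n=1}^{N} \sum_{k=1}^{K} w_{nk}^{(t)} \log \frac{\pi_k\, \mathcal{N}\!\left(x_n \mid \mu_k, \Sigma_k\right)}{w_{nk}^{(t)}},
$$

with equality at $\theta = \theta^{(t)}$, so $g$ minorizes the log-likelihood at the current iterate. Maximizing $g$ in closed form gives

$$
\pi_k^{(t+1)} = \frac{1}{N} \sum_{n=1}^{N} w_{nk}^{(t)},
\qquad
\mu_k^{(t+1)} = \frac{\sum_{n=1}^{N} w_{nk}^{(t)}\, x_n}{\sum_{n=1}^{N} w_{nk}^{(t)}},
\qquad
\Sigma_k^{(t+1)} = \frac{\sum_{n=1}^{N} w_{nk}^{(t)} \left(x_n - \mu_k^{(t+1)}\right)\left(x_n - \mu_k^{(t+1)}\right)^{\!\top}}{\sum_{n=1}^{N} w_{nk}^{(t)}},
$$

which coincide with the familiar EM updates, consistent with the equivalence stated in the abstract.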