An Implicit Form of Krasulina's k-PCA Update without the Orthonormality Constraint
Format: | Article |
Language: | English |
Abstract: | We shed new insights on the two commonly used updates for the online $k$-PCA
problem, namely, Krasulina's and Oja's updates. We show that Krasulina's update
corresponds to a projected gradient descent step on the Stiefel manifold of the
orthonormal $k$-frames, while Oja's update amounts to a gradient descent step
using the unprojected gradient. Following these observations, we derive a more
\emph{implicit} form of Krasulina's $k$-PCA update, i.e. a version that uses
the information of the future gradient as much as possible. Most interestingly,
our implicit Krasulina update avoids the costly QR-decomposition step by
bypassing the orthonormality constraint. We show that the new update in fact
corresponds to an online EM step applied to a probabilistic $k$-PCA model. The
probabilistic view of the updates allows us to combine multiple models in a
distributed setting. We show experimentally that the implicit Krasulina update
yields superior convergence while being significantly faster. We also give
strong evidence that the new update can benefit from parallelism and is more
stable w.r.t. tuning of the learning rate. |
DOI: | 10.48550/arxiv.1909.04803 |
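For orientation, below is a minimal NumPy sketch of the standard *explicit* forms of the two updates the abstract contrasts; it is an illustrative assumption on our part, not code from the paper, and it does not implement the paper's implicit variant. Oja's step uses the raw gradient $x x^\top U$, while Krasulina's step first projects it onto the tangent space of the Stiefel manifold, $(I - UU^\top) x x^\top U$; both then apply the QR re-orthonormalization that the implicit update is designed to avoid. Variable names, the toy data, and the fixed learning rate are arbitrary choices.

```python
import numpy as np

def oja_step(U, x, lr):
    """One Oja step: move along the unprojected gradient x x^T U, then QR."""
    g = np.outer(x, x @ U)           # x x^T U, shape (d, k)
    Q, _ = np.linalg.qr(U + lr * g)  # re-orthonormalize the k-frame
    return Q

def krasulina_step(U, x, lr):
    """One Krasulina step: project the gradient onto the tangent space of the
    Stiefel manifold at U, i.e. use (I - U U^T) x x^T U, then QR."""
    g = np.outer(x, x @ U)           # x x^T U
    g = g - U @ (U.T @ g)            # (I - U U^T) x x^T U
    Q, _ = np.linalg.qr(U + lr * g)
    return Q

# Toy stream: estimate the top-k subspace of a synthetic covariance.
rng = np.random.default_rng(0)
d, k, lr = 20, 3, 0.05
A = rng.standard_normal((d, d))
cov = A @ A.T / d                                   # ground-truth covariance
U = np.linalg.qr(rng.standard_normal((d, k)))[0]    # random orthonormal start
for _ in range(20000):
    x = rng.multivariate_normal(np.zeros(d), cov)
    U = krasulina_step(U, x, lr)                    # or oja_step(U, x, lr)
```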