Hierarchical Particle Swarm Optimization-incorporated Latent Factor Analysis for Large-Scale Incomplete Matrices
Saved in:
Published in: IEEE transactions on big data, 2022-12, Vol. 8 (6), p. 1524-1536
Main authors:
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: A Stochastic Gradient Descent (SGD)-based Latent Factor Analysis (LFA) model is highly efficient in representative learning on a High-Dimensional and Sparse (HiDS) matrix, where learning rate adaptation is vital to its efficiency and practicability. The learning rate adaptation of an SGD-based LFA model can be achieved efficiently by learning rate evolution with an evolutionary computing algorithm. However, the resultant model commonly suffers from twofold premature convergence issues: a) premature convergence of the learning rate swarm governed by the evolution algorithm, and b) premature convergence of the LFA model caused by the compound effects of evolution-based learning rate adaptation and the adopted optimization algorithm. To address these issues, this work proposes a Hierarchical Particle swarm optimization-incorporated Latent factor analysis (HPL) model with a two-layered structure. The first layer pre-trains the desired latent factors with a position-transitional particle swarm optimization-based LFA model with learning rate adaptation, while the second layer performs latent factor refinement with a newly proposed mini-batch particle swarm optimization algorithm. Experimental results on four HiDS matrices generated by industrial applications demonstrate that an HPL model handles the mentioned premature convergence issues well, thereby achieving highly accurate representation of HiDS matrices.
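To make the setup described in the abstract concrete, the following is a minimal sketch (not the authors' HPL implementation) of an SGD-based LFA model on a sparse matrix whose learning rate is evolved by a small particle swarm. All names (`sgd_epoch`, `fitness`, the toy data, and the PSO constants) are illustrative assumptions; the paper's position-transitional PSO pre-training layer and mini-batch PSO refinement layer are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Toy HiDS data: a handful of observed (user, item, rating) triples ------
n_users, n_items, k = 50, 40, 5
triples = [(rng.integers(n_users), rng.integers(n_items), rng.uniform(1, 5))
           for _ in range(400)]

def sgd_epoch(triples, P, Q, eta, lam=0.02):
    """One SGD pass over the observed entries of the sparse matrix."""
    for u, i, r in triples:
        pu, qi = P[u].copy(), Q[i].copy()
        err = r - pu @ qi                      # error on this single entry
        P[u] += eta * (err * qi - lam * pu)    # update user latent factors
        Q[i] += eta * (err * pu - lam * qi)    # update item latent factors

def rmse(triples, P, Q):
    return np.sqrt(np.mean([(r - P[u] @ Q[i]) ** 2 for u, i, r in triples]))

def fitness(eta, P, Q):
    """Fitness of a candidate learning rate: RMSE after one trial epoch."""
    P2, Q2 = P.copy(), Q.copy()
    sgd_epoch(triples, P2, Q2, eta)
    return rmse(triples, P2, Q2)

# --- PSO over the learning rate (a 1-D search space) -------------------------
P = rng.normal(scale=0.1, size=(n_users, k))
Q = rng.normal(scale=0.1, size=(n_items, k))
swarm = rng.uniform(1e-4, 0.1, size=8)           # candidate learning rates
vel = np.zeros_like(swarm)
pbest = swarm.copy()
pbest_fit = np.array([fitness(x, P, Q) for x in swarm])
gbest = pbest[pbest_fit.argmin()]

w, c1, c2 = 0.7, 1.5, 1.5                        # inertia / cognitive / social
for epoch in range(20):
    for j in range(len(swarm)):
        r1, r2 = rng.random(2)
        vel[j] = (w * vel[j] + c1 * r1 * (pbest[j] - swarm[j])
                  + c2 * r2 * (gbest - swarm[j]))
        swarm[j] = np.clip(swarm[j] + vel[j], 1e-5, 0.2)
        f = fitness(swarm[j], P, Q)
        if f < pbest_fit[j]:
            pbest[j], pbest_fit[j] = swarm[j], f
    gbest = pbest[pbest_fit.argmin()]
    sgd_epoch(triples, P, Q, gbest)              # train with the best rate found
    print(f"epoch {epoch:2d}  eta={gbest:.4f}  RMSE={rmse(triples, P, Q):.4f}")
```

This sketch only illustrates "learning rate evolution with an evolutionary computing algorithm"; it does not model the twofold premature convergence issues or the hierarchical two-layer remedy the paper proposes.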
ISSN: 2332-7790, 2372-2096
DOI: 10.1109/TBDATA.2021.3090905