Kernel Correntropy Conjugate Gradient Algorithms Based on Half-Quadratic Optimization
Published in: IEEE transactions on cybernetics, 2021-11, Vol. 51 (11), p. 5497-5510
Main authors: , ,
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: As a nonlinear similarity measure defined in the kernel space, the correntropic loss (C-Loss) can address the stability issues of second-order similarity measures thanks to its ability to extract high-order statistics of data. However, the kernel adaptive filter (KAF) based on the C-Loss uses the stochastic gradient descent (SGD) method to update its weights and, thus, suffers from poor performance and a slow convergence rate. To address these issues, a conjugate gradient (CG)-based correntropy algorithm is developed by solving the combination of half-quadratic (HQ) optimization and weighted least-squares (LS) problems, yielding a novel robust kernel correntropy CG (KCCG) algorithm. The proposed KCCG achieves performance comparable to that of the kernel recursive maximum correntropy (KRMC) algorithm at lower computational complexity. To further curb the growth of the network in KCCG, the random Fourier features KCCG (RFFKCCG) algorithm is proposed by transforming the original input data into a fixed-dimensional random Fourier feature space (RFFS). Since only the current error is used in the loss function of RFFKCCG, it provides a more efficient filter structure than other KAFs with sparsification. Monte Carlo simulations on the prediction of synthetic and real-world chaotic time series and on regression for large-scale datasets validate the superiority of the proposed algorithms in terms of robustness, filtering accuracy, and complexity.
ISSN: 2168-2267, 2168-2275
DOI: 10.1109/TCYB.2019.2959834