Error Analysis of Least-Squares $l^{q}$-Regularized Regression Learning Algorithm With the Non-Identical and Dependent Samples



Bibliographic Details
Published in: IEEE Access, 2018-01, Vol. 6, p. 43824-43829
Main Authors: Guo, Qin; Ye, Peixin
Format: Article
Language: English
Subjects:
Online Access: Full Text
Description
Summary: The selection of the penalty functional is critical for the performance of a regularized learning algorithm, and thus the $l^{q}$-regularizer ($1 \leq q \leq 2$) deserves special attention. We consider the regularized least-squares regression learning algorithm with non-identical and weakly dependent samples. The dependent samples satisfy a polynomially $\beta$-mixing condition, and the sequence of non-identical sampling marginal measures converges exponentially to a probability measure in the dual of a Hölder space. We conduct a rigorous unified error analysis and derive satisfactory learning rates for the algorithm via the stepping-stone technique in the error decomposition and the independent-blocks technique in the sample error estimates.
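For orientation, a common form of the coefficient-based $l^{q}$-regularization scheme studied in this line of work is sketched below; the kernel-expansion hypothesis space and the symbols $m$, $K(\cdot,\cdot)$, and $\lambda$ are generic notational assumptions and not taken verbatim from the paper:
\[
f_{\mathbf{z}} = \sum_{i=1}^{m} c_{i}^{\mathbf{z}}\, K(\cdot, x_i), \qquad
(c_{i}^{\mathbf{z}})_{i=1}^{m} = \arg\min_{c \in \mathbb{R}^{m}}
\Bigg\{ \frac{1}{m} \sum_{i=1}^{m} \bigg( \sum_{j=1}^{m} c_{j} K(x_i, x_j) - y_i \bigg)^{2}
+ \lambda \sum_{i=1}^{m} |c_{i}|^{q} \Bigg\}, \quad 1 \leq q \leq 2,
\]
where $\mathbf{z} = \{(x_i, y_i)\}_{i=1}^{m}$ denotes the sample, $K$ a kernel, and $\lambda > 0$ the regularization parameter. Here $q = 2$ gives a squared $l^{2}$-penalty on the coefficient vector, while values of $q$ closer to 1 promote sparser coefficient vectors.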
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2018.2863600