Regression and Multiclass Classification Using Sparse Extreme Learning Machine via Smoothing Group L1/2 Regularizer
Published in: IEEE Access, 2020, Vol. 8, pp. 191482-191494
Main authors: , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: The extreme learning machine (ELM) is a simple feedforward neural network that has been used extensively in applications for its extremely fast learning speed and good generalization performance. Nevertheless, it is normally implemented under the empirical risk minimization scheme, so models trained by ELM are prone to overfitting. In addition, ELM provides more hidden nodes than it actually needs, which means the network structure is not sparse enough. To solve these problems, two efficient algorithms for training ELM are proposed in this article: a group L_{1/2} regularization method and a smoothing group L_{1/2} regularization method. The basic group L_{1/2} regularizer, however, is nondifferentiable at the origin, which causes oscillation during training; we therefore modify it by smoothing it at the origin. Simulation results show that ELM with smoothing group L_{1/2} regularization can effectively prune redundant nodes and redundant weights of the surviving nodes, and that it performs better than traditional ELM, ELM with L_{1} regularization, and ELM with group L_{1/2} regularization.
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2020.3031647
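The abstract describes smoothing the group L_{1/2} regularizer because the group-norm square root is nondifferentiable at the origin. A minimal sketch of this idea, assuming each column of the output-weight matrix forms one group (one hidden node) and using one common smoothing (a quadratic matched to sqrt in value and derivative at a threshold `a`; the paper's exact smoothing function may differ):

```python
import numpy as np

def smoothed_sqrt(t, a=0.05):
    """Smooth approximation of sqrt(t) near the origin.

    For t >= a it equals sqrt(t); for t < a a quadratic is substituted
    whose value and first derivative match sqrt at t = a, so the result
    is differentiable everywhere, including t = 0. This is an assumed,
    illustrative smoothing, not necessarily the paper's exact choice.
    """
    t = np.asarray(t, dtype=float)
    inner = t**2 / (4.0 * a**1.5) + 0.75 * np.sqrt(a)
    return np.where(t >= a, np.sqrt(np.maximum(t, 0.0)), inner)

def group_l12_penalty(W, a=0.05, lam=1e-3):
    """Smoothed group L_{1/2} penalty on an ELM output-weight matrix W.

    Hypothetical grouping: each column of W holds the outgoing weights
    of one hidden node. The penalty is lam * sum_g sqrt(||w_g||_2),
    with the square root smoothed near zero; groups driven to zero
    correspond to hidden nodes that can be pruned.
    """
    norms = np.linalg.norm(W, axis=0)
    return lam * smoothed_sqrt(norms, a).sum()
```

Because the smoothed penalty is differentiable at zero, it can be added directly to the ELM training objective and minimized by gradient descent without the oscillation the abstract attributes to the nonsmooth regularizer.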