Multi-output incremental back-propagation
Published in: Neural Computing & Applications, 2023-07, Vol. 35 (20), pp. 14897-14910
Main authors: , , , , ,
Format: Article
Language: English
Online access: Full text
Abstract: Deep learning techniques can form generalized models that solve problems not solvable by traditional approaches, which explains the ubiquity of deep learning models across domains. However, considerable time is spent searching for the optimal hyperparameters that help a model generalize and reach its highest accuracy. This paper investigates a proposed model that incorporates hybrid layers and a novel approach to weight initialization, aimed at (1) reducing the overall trial-and-error time spent finding the optimal number of layers by providing the necessary insights, and (2) reducing the randomness in weight initialization through a novel incremental back-propagation-based model architecture. Together with principal component analysis (PCA)-based initialization, the model provides a substantially more stable weight initialization, improving train and test performance and speeding up convergence to an optimal solution. The proposed approach was evaluated on three data sets and outperformed state-of-the-art initialization methods.
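For orientation, the sketch below illustrates the general idea of PCA-based weight initialization mentioned in the abstract: a layer's weight columns are seeded with the leading principal directions of its input data rather than drawn at random. This is a minimal sketch under stated assumptions; the function name `pca_init` and all parameters are illustrative, and the paper's actual hybrid-layer architecture and incremental back-propagation scheme are not reproduced here.

```python
import numpy as np

def pca_init(X, n_units, rng=None):
    """Initialize a dense layer's weights from the principal components of
    its input data X (shape: n_samples x n_features).

    Illustrative sketch only; the paper's exact scheme may differ.
    """
    rng = rng or np.random.default_rng(0)
    Xc = X - X.mean(axis=0)                     # center the data
    # SVD of the centered data: rows of Vt are the principal directions
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    k = min(n_units, Vt.shape[0])               # at most n_features components
    W = np.empty((X.shape[1], n_units))
    W[:, :k] = Vt[:k].T                         # leading components as weight columns
    if n_units > k:                             # fall back to small random values
        W[:, k:] = rng.normal(scale=0.01, size=(X.shape[1], n_units - k))
    return W

# Usage: seed a 64-unit layer from 1,000 samples of 20-dimensional inputs
X = np.random.default_rng(1).normal(size=(1000, 20))
W = pca_init(X, n_units=64)
print(W.shape)  # (20, 64): 20 PCA columns, 44 random fallback columns
```

Because the principal directions capture the dominant variance of the inputs, such an initialization is data-dependent and deterministic, which is consistent with the abstract's claim of reduced randomness at initialization.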
ISSN: 0941-0643, 1433-3058
DOI: 10.1007/s00521-023-08490-4