A weights direct determination neuronet for time‐series with applications in the industrial indices of the Federal Reserve Bank of St. Louis
Published in: Journal of Forecasting, 2022-11, Vol. 41 (7), p. 1512-1524
Main Author:
Format: Article
Language: English
Subjects:
Online Access: Full text
Summary: The shortcomings of conventional back‐propagation neuronets, such as slow training and convergence to local minima, are known to be addressed by neuronets trained under the weights‐and‐structure‐determination (WASD) algorithm. Derived from power activation feed‐forward neuronets, a multi‐input WASD for time‐series neuronet (MI‐WASDTSN) model is presented in this paper. The MI‐WASDTSN is equipped with a novel WASD for time‐series (WASDTS) algorithm for handling time‐series modeling and forecasting problems. Employing a power sigmoid activation function, the WASDTS algorithm handles model fitting and validation by determining the optimal number of input variables and the weights of the MI‐WASDTSN. More specifically, the WASDTS algorithm finds and holds only the activation function powers that reduce the model's error during validation. Applications to Federal Reserve Bank of St. Louis (FRED) industrial indices under three different time‐series patterns validate our MI‐WASDTSN model and demonstrate its outstanding learning and forecasting performance. In addition, to support and advance the findings of this work, we created a MATLAB repository for interested users, which is freely available via GitHub.
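The two ideas the summary highlights, solving the output weights directly instead of back-propagating, and holding only the activation powers that lower the validation error, can be sketched briefly. The Python fragment below is a minimal illustration, not the authors' MATLAB code: the power-sigmoid form, the single-lag input, the function names, and the greedy selection loop are all assumptions made for the sketch.

```python
import numpy as np

def power_sigmoid(x, p):
    # Assumed power-sigmoid activation: a sigmoid applied to the p-th power of the input.
    return 1.0 / (1.0 + np.exp(-np.power(x, p)))

def hidden_matrix(x, powers):
    # One hidden neuron per kept power; columns are power-sigmoid features of the input.
    return np.column_stack([power_sigmoid(x, p) for p in powers])

def direct_weights(x, y, powers):
    # Weights-direct-determination step: output weights in one shot via the
    # Moore-Penrose pseudoinverse, instead of iterative back-propagation.
    return np.linalg.pinv(hidden_matrix(x, powers)) @ y

def select_powers(x_tr, y_tr, x_va, y_va, max_power=8):
    # Structure determination (sketch): greedily hold only those powers that
    # reduce the mean squared error on the validation split.
    kept, best = [], np.inf
    for p in range(1, max_power + 1):
        trial = kept + [p]
        w = direct_weights(x_tr, y_tr, trial)
        err = np.mean((hidden_matrix(x_va, trial) @ w - y_va) ** 2)
        if err < best:
            kept, best = trial, err
    return kept, best

# Toy usage on a synthetic series: predict the next value from the previous one
# (a single lag), with a simple train/validation split.
t = np.linspace(0, 6, 200)
series = np.sin(t) + 0.05 * np.random.randn(200)
x, y = series[:-1], series[1:]
powers, val_mse = select_powers(x[:120], y[:120], x[120:], y[120:])
```

Under these assumptions, the pseudoinverse solve replaces gradient training entirely, and the validation loop plays the role of the structure-determination step described in the summary.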
ISSN: 0277-6693, 1099-131X
DOI: 10.1002/for.2874