Design of Hierarchical Neural Networks Using Deep LSTM and Self-Organizing Dynamical Fuzzy-Neural Network Architecture
Published in: IEEE Transactions on Fuzzy Systems, 2024-05, Vol. 32 (5), pp. 2915-2929
Authors: , , , , ,
Format: Article
Language: English
Abstract: Time series forecasting is an essential and challenging task, especially large-scale time-series (LSTS) forecasting, which plays a crucial role in many real-world applications. Due to the instability of time-series data and the randomness (noise) in their characteristics, it is difficult for the polynomial neural network (PNN) and its modifications to achieve accurate and stable time-series prediction. In this study, we propose a novel hierarchical neural network (HNN) structure realized by long short-term memory (LSTM) and a self-organizing dynamical fuzzy-neural network architecture composed of two classes of neurons, fuzzy rule-based polynomial neurons (FPNs) and polynomial neurons, constructed by variant generation of nodes as well as network layers. The proposed HNN combines deep learning with the PNN method for the first time and extends it to time-series prediction as a modification of PNN. The LSTM extracts the temporal dependencies present in each time series and enables the model to learn its representation. The FPNs are designed to capture complex nonlinear patterns in the data space by utilizing fuzzy C-means (FCM) clustering and least-square-error-based learning of polynomial functions. The self-organizing hierarchical network architecture, generated by an elitism-based roulette wheel selection strategy, ensures that candidate neurons exhibit sufficient fitting ability while enriching the diversity of heterogeneous neurons, addressing the issue of multicollinearity and providing opportunities to select better prediction neurons. In addition, L2-norm regularization is applied to mitigate overfitting. Experiments are conducted on nine real-world LSTS datasets, including three practical applications. The results show that the proposed model achieves high prediction performance, outperforming many state-of-the-art models.
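The abstract describes FPNs as combining fuzzy C-means clustering with least-square-error learning of polynomial consequents. A minimal sketch of that idea is given below; it is not the paper's implementation — the function names, the first-order polynomial basis, and the fuzzifier value m = 2 are illustrative assumptions.

```python
import numpy as np

def fcm_memberships(X, centers, m=2.0):
    """Fuzzy C-means membership of each sample in each cluster (rows sum to 1)."""
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
    # standard FCM formula: u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1))
    ratio = (d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0))
    return 1.0 / ratio.sum(axis=2)

def fit_fpn(X, y, centers):
    """Fit one first-order polynomial consequent per fuzzy rule by
    membership-weighted least squares; returns (coefs, memberships)."""
    U = fcm_memberships(X, centers)             # (n, c) rule firing strengths
    Phi = np.hstack([np.ones((len(X), 1)), X])  # bias + linear terms
    coefs = []
    for k in range(centers.shape[0]):
        w = np.sqrt(U[:, k])
        # weighted LSQ: minimize sum_i U_ik * (phi_i . c_k - y_i)^2
        c_k, *_ = np.linalg.lstsq(Phi * w[:, None], y * w, rcond=None)
        coefs.append(c_k)
    return np.array(coefs), U

def predict_fpn(X, centers, coefs):
    """Blend the rules' local polynomial outputs by their membership degrees."""
    U = fcm_memberships(X, centers)
    Phi = np.hstack([np.ones((len(X), 1)), X])
    return (U * (Phi @ coefs.T)).sum(axis=1)
```

In the paper the polynomial order, cluster centers, and regularization are part of the self-organizing design; here the centers are simply given and the consequents are plain linear-plus-bias functions.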
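The elitism-based roulette wheel selection strategy mentioned in the abstract can be sketched as follows. This is a generic hypothetical version (the paper's scoring function, elite count, and slot counts are not given here): the best candidates survive unconditionally, and the remaining slots are filled fitness-proportionally, which preserves diversity among heterogeneous neurons.

```python
import numpy as np

def elitist_roulette_select(fitness, n_select, n_elite=2, rng=None):
    """Keep the n_elite best candidates outright, then fill the remaining
    slots by fitness-proportional (roulette wheel) sampling without
    replacement. Lower fitness = better (e.g., prediction error)."""
    rng = np.random.default_rng(rng)
    fitness = np.asarray(fitness, dtype=float)
    order = np.argsort(fitness)                 # ascending error
    elites = list(order[:n_elite])
    pool = [i for i in range(len(fitness)) if i not in elites]
    # convert errors into wheel slices: smaller error -> larger slice
    scores = 1.0 / (1e-12 + fitness[pool])
    probs = scores / scores.sum()
    extra = rng.choice(pool, size=n_select - n_elite, replace=False, p=probs)
    return elites + list(extra)
```

Compared with pure roulette wheel selection, the elitist step guarantees that the best-fitting candidate neurons are never lost, while the stochastic step still gives weaker but diverse neurons a chance to enter the next layer.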
ISSN: 1063-6706, 1941-0034
DOI: 10.1109/TFUZZ.2024.3361856