Adaptive Structural Learning Method of Recurrent Deep Belief Network for Time Series Analysis
Published in: | Keisoku Jidō Seigyo Gakkai ronbunshū, 2018, Vol. 54(8), pp. 628-639
---|---
Main authors: | ,
Format: | Article
Language: | English; Japanese
Online access: | Full text
Abstract: | Deep Belief Network (DBN), a well-known Deep Learning method, has a deep network architecture that can represent multiple features of input patterns hierarchically. Each layer employs a pre-trained Restricted Boltzmann Machine (RBM). In a DBN built from RBMs, difficulties arise in finding the optimal network structure and the best set of weights and threshold values. As a solution, we developed an adaptive structure learning method of DBN that can discover an optimal number of hidden neurons for given input data in an RBM by a neuron generation / annihilation algorithm, and an optimal number of hidden layers in the DBN by an extension of that algorithm. Moreover, the Long Short-Term Memory (LSTM) model can make accurate predictions for time series data sets. Various LSTM network architectures have been proposed and achieve highly accurate predictions on benchmark data sets, but the problems of choosing the optimal structure and some parameters still remain. In this paper, the adaptive structure model of RBM and DBN is applied to the LSTM model, and its effectiveness is verified by 10-fold cross validation on benchmark data sets.
ISSN: | 0453-4654; 1883-8189
DOI: | 10.9746/sicetr.54.628 |
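
The abstract above refers to a neuron generation / annihilation algorithm for adapting the number of hidden neurons in an RBM. As a rough illustration only, the Python sketch below grows a layer when a unit's recent weight updates fluctuate strongly and prunes units whose weights have decayed toward zero; the criteria, thresholds, and the function name `adapt_hidden_units` are assumptions made for this sketch and are not taken from the paper.

```python
import numpy as np

def adapt_hidden_units(weights, update_history, gen_threshold=0.05, ann_threshold=1e-3):
    """Illustrative neuron generation / annihilation step for one RBM layer.

    weights:        (n_visible, n_hidden) weight matrix
    update_history: (n_steps, n_hidden) recent per-hidden-unit update magnitudes
    Returns a possibly resized weight matrix.
    """
    # Generation: a hidden unit whose recent updates keep fluctuating is treated
    # as overloaded, so a sibling unit is inserted with slightly perturbed weights.
    fluctuation = update_history.var(axis=0)
    for j in np.where(fluctuation > gen_threshold)[0]:
        new_col = 0.5 * weights[:, [j]] + 0.01 * np.random.randn(weights.shape[0], 1)
        weights = np.hstack([weights, new_col])

    # Annihilation: a hidden unit whose outgoing weights are all near zero
    # contributes little to the representation and is removed.
    keep = np.abs(weights).mean(axis=0) > ann_threshold
    return weights[:, keep]

# Toy usage: start with 4 hidden units and let the rule resize the layer.
W = 0.1 * np.random.randn(6, 4)
history = 0.3 * np.random.randn(50, 4)
W = adapt_hidden_units(W, history)
print("hidden units after adaptation:", W.shape[1])
```

In the paper itself the generation and annihilation conditions are derived from the training behaviour of the RBM and extended to add or remove whole hidden layers of the DBN; the snippet only conveys the overall grow-and-prune idea.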