A pruning feedforward small-world neural network based on Katz centrality for nonlinear system modeling



Bibliographic Details
Published in: Neural Networks, 2020-10, Vol. 130, p. 269-285
Main authors: Li, Wenjing, Chu, Minghui, Qiao, Junfei
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Abstract: By approaching the structure of biological neural networks, small-world neural networks have been shown to improve the generalization performance of artificial neural networks. However, the architecture of small-world neural networks is typically large and predefined, which can cause overfitting and long training times and prevents an optimal network structure from being obtained automatically for a given problem. To solve these problems, this paper proposes a pruning feedforward small-world neural network (PFSWNN) and applies it to nonlinear system modeling. Firstly, a feedforward small-world neural network (FSWNN) is constructed according to the rewiring rule of Watts–Strogatz. Secondly, the importance of each hidden neuron is evaluated based on its Katz centrality. If the Katz centrality of a hidden neuron falls below a predefined threshold, the neuron is considered unimportant and is merged with its most correlated neuron in the same hidden layer. The connection weights are trained with a gradient-based algorithm, and the convergence of the proposed PFSWNN is analyzed theoretically. Finally, the PFSWNN model is tested on several nonlinear system modeling problems, including the approximation of a rapidly changing function, the CATS missing time-series prediction task, four benchmark problems from public UCI datasets, and a practical problem from the wastewater treatment process. Experimental results demonstrate that PFSWNN achieves superior generalization performance owing to its small-world property and the pruning algorithm, and that its training time is shortened owing to the compact structure.
Highlights:
• The importance of the hidden nodes is measured using Katz centrality.
• The small-world property and pruning improve the generalization performance of PFSWNN.
• PFSWNN can self-adjust to a compact structure by the pruning algorithm.
• The convergence of PFSWNN can be guaranteed.
ISSN: 0893-6080
1879-2782
DOI: 10.1016/j.neunet.2020.07.017
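The abstract outlines two algorithmic steps: constructing a feedforward small-world network by Watts–Strogatz-style rewiring, and scoring hidden neurons by Katz centrality to decide which ones to prune. The following is a minimal Python sketch of those two steps only, under the assumption of a layered directed topology; the layer sizes, rewiring probability p, attenuation factor alpha, and pruning threshold are illustrative choices, not the paper's settings.

import numpy as np

rng = np.random.default_rng(0)


def build_layered_adjacency(layer_sizes):
    """Binary adjacency matrix of a fully connected layered feedforward network."""
    n = sum(layer_sizes)
    offsets = np.cumsum([0] + list(layer_sizes))
    adj = np.zeros((n, n))
    for l in range(len(layer_sizes) - 1):
        src = slice(offsets[l], offsets[l + 1])
        dst = slice(offsets[l + 1], offsets[l + 2])
        adj[src, dst] = 1.0
    return adj, offsets


def rewire_small_world(adj, offsets, p=0.1):
    """Rewire each forward edge with probability p to a random neuron in a
    later layer (a Watts-Strogatz-style rule on a directed layered topology)."""
    adj = adj.copy()
    layer_of = np.searchsorted(offsets[1:], np.arange(adj.shape[0]), side="right")
    for i, j in zip(*np.nonzero(adj)):
        if rng.random() < p:
            later = np.nonzero(layer_of > layer_of[i])[0]
            candidates = [k for k in later if adj[i, k] == 0]
            if candidates:
                adj[i, j] = 0.0
                adj[i, rng.choice(candidates)] = 1.0
    return adj


def katz_centrality(adj, alpha=0.05, beta=1.0):
    """Katz centrality x = beta * (I - alpha * A^T)^{-1} * 1 for a directed graph."""
    n = adj.shape[0]
    return np.linalg.solve(np.eye(n) - alpha * adj.T, beta * np.ones(n))


# Toy topology: 3 inputs, two hidden layers of 8 neurons each, 1 output.
layer_sizes = [3, 8, 8, 1]
adj, offsets = build_layered_adjacency(layer_sizes)
adj = rewire_small_world(adj, offsets, p=0.1)

katz = katz_centrality(adj)
hidden = np.arange(offsets[1], offsets[-2])           # indices of hidden neurons
threshold = np.percentile(katz[hidden], 25)           # illustrative threshold only
prune_candidates = hidden[katz[hidden] < threshold]   # neurons flagged for merging
print("hidden Katz centralities:", np.round(katz[hidden], 3))
print("neurons flagged for pruning:", prune_candidates)

In the paper, a neuron flagged in this way is merged with its most correlated neuron in the same hidden layer rather than simply deleted, and the connection weights are then trained with a gradient-based algorithm; that merging and training loop is omitted from the sketch above.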