A New Parameter Repurposing Method for Parameter Transfer With Small Dataset and Its Application in Fault Diagnosis of Rolling Element Bearings
Published in: IEEE Access, 2019, Vol. 7, pp. 46917-46930
Main authors: ,
Format: Article
Language: eng
Keywords:
Online access: Full text
Abstract: Transfer learning is a promising deep learning approach for applications that suffer from insufficient training data. Parameter transfer, which improves accuracy and training speed by training the target network with the parameters of the source network, conventionally chooses between two parameter repurposing methods, parameter freezing and fine-tuning, depending on the amount of target data and the network size. In this paper, we propose a novel method that improves performance in the target domain by taking an intermediate approach between the two. The proposed method, selective parameter freezing (SPF), freezes only a portion of the parameters within a layer, rather than freezing or fine-tuning all of them, by selecting output-sensitive parameters from the source network. Freezing only the sensitive parameters during training reduces the number of trainable parameters and protects informative parameters from overfitting to the small amount of target data. Using two source-target domain pairs, artificial faults of different sizes and artificial-to-natural faults of rolling element bearings, the proposed SPF adapts to the target domain better than conventional approaches by choosing the best degree of freezing for various amounts of target data and network sizes.
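The abstract describes freezing only the output-sensitive parameters of a layer while fine-tuning the rest. A minimal sketch of that idea, assuming a sensitivity score of |w · grad| and the function name `spf_update` (the paper defines its own output-sensitivity measure, so the score, names, and freeze ratio here are illustrative assumptions, not the authors' method):

```python
def spf_update(weights, grads, lr=0.1, freeze_ratio=0.5):
    """One SGD step that skips the most sensitive parameters (SPF sketch)."""
    # Sensitivity of each parameter, approximated here as |w * grad|
    # (assumption; the paper uses its own output-sensitivity measure).
    sensitivity = [abs(w * g) for w, g in zip(weights, grads)]
    k = int(freeze_ratio * len(weights))
    # Indices of the k most sensitive parameters: these stay frozen.
    frozen = set(sorted(range(len(weights)), key=lambda i: sensitivity[i])[-k:])
    # Plain SGD step on the remaining (trainable) parameters only.
    return [w if i in frozen else w - lr * g
            for i, (w, g) in enumerate(zip(weights, grads))]

w = [1.0, -2.0, 0.5, 3.0]
g = [0.4, 0.1, -0.2, 0.3]
new_w = spf_update(w, g, lr=0.1, freeze_ratio=0.5)
# Sensitivities are [0.4, 0.2, 0.1, 0.9]; the two most sensitive weights
# (indices 0 and 3) are frozen, so only indices 1 and 2 are updated.
```

Varying `freeze_ratio` between 0 (full fine-tuning) and 1 (full freezing) spans the intermediate regime the paper explores.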
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2906273