Rolling Bearing Fault Diagnosis Based on Time-Frequency Compression Fusion and Residual Time-Frequency Mixed Attention Network
Saved in:
Published in: | Applied Sciences 2022-05, Vol.12 (10), p.4831 |
---|---|
Main authors: | , , , , |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
Abstract: | Traditional rolling bearing fault diagnosis algorithms suffer from problems such as insufficient information in time-frequency images and poor feature extraction ability in the diagnosis model, which limit further improvements in diagnosis performance. In this article, both the time-frequency image input and the intelligent diagnosis algorithm are optimized. First, the characteristics of two advanced time-frequency analysis algorithms, the multisynchrosqueezing transform (MSST) and the time-reassigned multisynchrosqueezing transform (TMSST), are analyzed in depth. Then, time-frequency compression fusion (TFCF) and a residual time-frequency mixed attention network (RTFANet) are proposed. TFCF superposes and splices the two time-frequency images into dual-channel images, which fully exploits the multi-channel feature fusion of the convolutional kernels in a convolutional neural network. RTFANet assigns attention weights to the channel, time, and frequency dimensions of the time-frequency images, so that the model focuses on crucial time-frequency information. A residual connection is introduced into the attention weighting process to reduce the information loss of the feature mapping. Experimental results show that the method converges after seven epochs, with a fast convergence rate and a recognition rate of 99.86%. Compared with other methods, the proposed method offers better robustness and precision. |
---|---|
ISSN: | 2076-3417 |
DOI: | 10.3390/app12104831 |
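
As a rough illustration of the dual-channel fusion and mixed-attention idea described in the abstract above, here is a minimal PyTorch-style sketch. All module names, kernel sizes, and the attention layout (squeeze-and-excitation-style channel attention plus axis-wise convolutional attention over frequency and time, with a residual connection) are assumptions for illustration; the paper's exact TFCF and RTFANet definitions are not given in this record.

```python
# Sketch (assumed structure): fuse two time-frequency maps into a 2-channel
# input, then re-weight channels, frequency rows, and time columns with a
# residual connection. Not the paper's exact RTFANet; an illustrative guess.
import torch
import torch.nn as nn


def tfcf(msst_map: torch.Tensor, tmsst_map: torch.Tensor) -> torch.Tensor:
    """Assumed time-frequency compression fusion: stack the MSST and TMSST
    magnitude maps, each of shape (F, T), into one dual-channel image."""
    return torch.stack([msst_map, tmsst_map], dim=0)  # -> (2, F, T)


class ResidualTFMixedAttention(nn.Module):
    """Assumed mixed attention block: channel, frequency, and time attention
    applied in sequence, with the input added back as a residual."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        hidden = max(channels // reduction, 1)
        # Channel attention: global average pool -> bottleneck MLP -> sigmoid.
        self.channel_att = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, hidden, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, channels, 1),
            nn.Sigmoid(),
        )
        # Frequency attention: convolution along the frequency axis.
        self.freq_att = nn.Sequential(
            nn.Conv2d(channels, 1, kernel_size=(7, 1), padding=(3, 0)),
            nn.Sigmoid(),
        )
        # Time attention: convolution along the time axis.
        self.time_att = nn.Sequential(
            nn.Conv2d(channels, 1, kernel_size=(1, 7), padding=(0, 3)),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, F, T)
        out = x * self.channel_att(x)                              # weight channels
        out = out * self.freq_att(out.mean(dim=3, keepdim=True))   # weight frequency rows
        out = out * self.time_att(out.mean(dim=2, keepdim=True))   # weight time columns
        return out + x                                             # residual connection
```

A usage sketch with placeholder data (real MSST/TMSST maps would come from a time-frequency analysis library applied to the vibration signal):

```python
msst_map = torch.rand(128, 256)                  # placeholder MSST magnitude map (F, T)
tmsst_map = torch.rand(128, 256)                 # placeholder TMSST magnitude map (F, T)
fused = tfcf(msst_map, tmsst_map).unsqueeze(0)   # (1, 2, 128, 256) dual-channel input
att = ResidualTFMixedAttention(channels=2)
features = att(fused)                            # same shape, attention-weighted
```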