A Robust Transform-Domain Deep Convolutional Network for Voltage Dip Classification

Bibliographic Details
Published in: IEEE Transactions on Power Delivery, Dec. 2018, Vol. 33, No. 6, pp. 2794-2802
Authors: Bagheri, Azam; Gu, Irene Y. H.; Bollen, Math H. J.; Balouji, Ebrahim
Format: Article
Language: English
Description
Abstract: This paper proposes a novel method for voltage dip classification using deep convolutional neural networks. The main contributions are: 1) a new, effective deep convolutional neural network architecture that automatically learns voltage dip features instead of relying on hand-crafted features; 2) application of deep learning in an effective two-dimensional (2-D) transform domain, under the space-phasor model (SPM), for efficient learning of dip features; 3) characterization of voltage dips by 2-D SPM-based deep learning, which yields dip features independent of the duration and sampling frequency of the dip recordings; and 4) robust, automatically extracted features that are insensitive to training and test datasets measured in different countries/regions. Experiments were conducted on datasets containing about 6000 measured voltage dips, spread over seven classes and gathered in several different countries. The results show good performance: the average classification rate is about 97% and the false-alarm rate is about 0.50%. The test results of the proposed method are compared with those of two existing dip classification methods, and the proposed method is shown to outperform both.
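The space-phasor model (SPM) named in the abstract is the standard complex transform of the three phase voltages, v(t) = (2/3)·(va(t) + a·vb(t) + a²·vc(t)) with a = exp(j2π/3); its magnitude and phase-angle trajectories form a 2-D representation of the dip. The Python sketch below illustrates that transform on a synthetic dip. It is an illustrative assumption about the preprocessing step, not the paper's exact pipeline: the dip depth, timing, fundamental frequency, and sampling frequency are made-up example values.

```python
import numpy as np

def space_phasor(va, vb, vc):
    """Complex space phasor v(t) = (2/3) * (va + a*vb + a^2*vc),
    with a = exp(j*2*pi/3) (standard SPM definition)."""
    a = np.exp(2j * np.pi / 3)
    return (2.0 / 3.0) * (va + a * vb + a * a * vc)

# Synthetic three-phase recording with a dip to 0.6 pu on phase a
# between 0.1 s and 0.2 s (all values are illustrative).
fs = 10_000                                   # sampling frequency [Hz]
t = np.arange(0.0, 0.3, 1.0 / fs)
depth = np.where((t >= 0.1) & (t < 0.2), 0.6, 1.0)
va = depth * np.cos(2 * np.pi * 50 * t)
vb = np.cos(2 * np.pi * 50 * t - 2 * np.pi / 3)
vc = np.cos(2 * np.pi * 50 * t + 2 * np.pi / 3)

v = space_phasor(va, vb, vc)
magnitude, angle = np.abs(v), np.angle(v)     # 2-D dip signature (|v|, arg v)
```

One plausible reading of contribution 3) is that pairing the |v(t)| and arg v(t) trajectories, resampled to a fixed-size 2-D grid, decouples the learned features from the recording's length and sampling rate; the paper's exact resampling scheme is not given in the abstract.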
ISSN: 0885-8977
ISSN: 1937-4208
DOI: 10.1109/TPWRD.2018.2854677