Object recognition for power equipment via human‐level concept learning

Detailed description

Bibliographic details
Published in: IET Generation, Transmission & Distribution, 2021-05, Vol. 15 (10), p. 1578-1587
Main authors: Xiong, Siheng; Liu, Yadong; Yan, Yingjie; Pei, Ling; Xu, Peng; Fu, Xiaofei; Jiang, Xiuchen
Format: Article
Language: English
Description

Abstract: Inspection robots are increasingly deployed in substations owing to the shortage of operation and maintenance personnel. However, these inspection robots remain at the level of perceptual intelligence rather than cognitive intelligence. To enable a robot to automatically detect defects in power equipment, object recognition is a critical step, because the criteria of infrared diagnosis vary with the type of equipment. Since this task is not a big‐sample learning problem, prior knowledge needs to be added to improve existing methods. Here, an object recognition model based on human‐level concept learning is proposed, which exploits the relationships between pieces of equipment. The proposed method is composed of three parts: Mask RCNN, a Bayesian Context Network, and human‐level concept learning. As the backbone network, Mask RCNN, a pixel‐wise segmentation network, gives preliminary recognition results. Then, based on the object relationship graph of the Bayesian Context Network, human‐level concept learning corrects the results in sequence by maximizing the conditional probability of an object given its neighbourhood. Experiments show that the accuracy of the proposed method increases by 9.7% compared with its backbone network, making industrial application of this technology possible.
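The correction step described in the abstract can be illustrated with a minimal sketch: revisit each preliminary detection in order of ascending detector confidence and relabel it with the class that maximizes the detector's score weighted by agreement with its already-labelled neighbours. All class names, probability tables, and function names below are hypothetical illustrations, not from the paper.

```python
# Hypothetical pairwise statistics P(label | neighbour_label), e.g. learned
# from annotated substation scenes; pairs not listed get a small smoothing value.
COND_PROB = {
    ("bushing", "transformer"): 0.9,
    ("insulator", "transformer"): 0.4,
    ("bushing", "breaker"): 0.3,
    ("insulator", "breaker"): 0.8,
}

def cond_prob(label, neighbour_label):
    """Conditional probability of a label given one neighbour's label."""
    return COND_PROB.get((label, neighbour_label), 0.05)

def correct_labels(detections, neighbours, classes):
    """Sequentially correct preliminary labels using the relationship graph.

    detections: {obj_id: (label, score)} from the backbone detector.
    neighbours: {obj_id: [neighbour obj_ids]} from the relationship graph.
    classes:    list of all candidate class names.
    """
    labels = {i: lab for i, (lab, _) in detections.items()}
    # Low-confidence detections are revisited first; confident ones anchor them.
    for obj in sorted(detections, key=lambda i: detections[i][1]):
        score = detections[obj][1]
        best, best_p = labels[obj], -1.0
        for c in classes:
            # Detector evidence: full score for the predicted class,
            # remaining mass split uniformly over the alternatives.
            p = score if c == labels[obj] else (1 - score) / (len(classes) - 1)
            # Neighbourhood evidence: product of pairwise conditionals.
            for n in neighbours.get(obj, []):
                p *= cond_prob(c, labels[n])
            if p > best_p:
                best, best_p = c, p
        labels[obj] = best
    return labels
```

For example, a low-confidence "insulator" detection adjacent to a confidently detected "transformer" would be relabelled "bushing" under the illustrative table above, since bushings co-occur with transformers far more often.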
ISSN: 1751-8687, 1751-8695
DOI: 10.1049/gtd2.12088