Self-Supervised Feature Learning for Appliance Recognition in Nonintrusive Load Monitoring

Bibliographic Details
Published in: IEEE Transactions on Industrial Informatics, 2024-02, Vol. 20 (2), p. 1698-1710
Main authors: Liu, Yinyan; Bai, Lei; Ma, Jin; Wang, Wei; Ouyang, Wanli
Format: Article
Language: English
Subjects:
Online access: Order full text
Description
Summary: Nonintrusive load monitoring (NILM) can monitor the operating state and energy consumption of electric appliances in a nonintrusive manner and provides a promising approach to improving electricity usage efficiency for residential and commercial buildings. Although machine learning (ML) methods are powerful and have significantly advanced the development of NILM, they require a sizable amount of labeled data for model training. However, obtaining operational data for each electrical appliance in real life is challenging, so the requirement for labeled data limits NILM's practicality. To tackle this challenge, a novel multilayer momentum contrast (MLMoCo) learning mechanism is proposed for self-supervised feature representation learning. With only unlabeled aggregate load data, the proposed MLMoCo contrasts augmented versions of the same sample ("positives") with instances extracted from other samples ("negatives"). To maintain a dictionary with enough negative samples to compare against the input, a momentum encoder is adopted whose parameters are updated by momentum rather than by backpropagation during training. An event-based data augmentation method is also proposed to obtain distinct but strongly related positive pairs for self-supervised feature learning. Experimental comparisons against different state-of-the-art techniques on various downstream tasks with real-world datasets demonstrate the remarkable performance gains of the proposed approach through learning from unlabeled data, which could significantly increase the practicality of NILM.
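The momentum-contrast mechanism the abstract describes can be illustrated with a minimal NumPy sketch: a key encoder tracks the query encoder via an exponential moving average (momentum update, no backpropagation), and an InfoNCE-style loss contrasts one positive pair against a queue of negatives. All array sizes, the momentum value, and the temperature below are arbitrary assumptions for illustration, not values taken from the paper's MLMoCo.

```python
import numpy as np

def momentum_update(key_params, query_params, m=0.999):
    """EMA update: the key (dictionary) encoder slowly follows the
    query encoder instead of receiving gradients. m is assumed."""
    return [m * k + (1.0 - m) * q for k, q in zip(key_params, query_params)]

def info_nce_loss(q, pos, negatives, tau=0.07):
    """InfoNCE: pull the positive embedding close to the query,
    push the queued negative embeddings away. tau is assumed."""
    q = q / np.linalg.norm(q)
    pos = pos / np.linalg.norm(pos)
    negs = negatives / np.linalg.norm(negatives, axis=1, keepdims=True)
    logits = np.concatenate([[q @ pos], negs @ q]) / tau
    logits -= logits.max()                       # numerical stability
    p = np.exp(logits) / np.exp(logits).sum()
    return -np.log(p[0])                         # positive sits at index 0

# Toy parameters and embeddings (shapes are illustrative only).
rng = np.random.default_rng(0)
key_w = [rng.normal(size=(8, 4))]
query_w = [rng.normal(size=(8, 4))]
key_w = momentum_update(key_w, query_w)          # key encoder drifts slowly
loss = info_nce_loss(rng.normal(size=4), rng.normal(size=4),
                     rng.normal(size=(16, 4)))   # 16 queued negatives
```

In MoCo-style training the positive pair would come from two augmented views of the same aggregate-load window (the paper's event-based augmentation), while the negatives come from the momentum-encoded dictionary queue.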
ISSN:1551-3203
1941-0050
DOI:10.1109/TII.2023.3280445