Angel or Devil: Discriminating Hard Samples and Anomaly Contaminations for Unsupervised Time Series Anomaly Detection
Saved in:
Main authors: |  |
---|---|
Format: | Article |
Language: | eng |
Subjects: |  |
Online access: | Order full text |
Abstract: | Training in unsupervised time series anomaly detection is constantly plagued by the need to discriminate between harmful 'anomaly contaminations' and beneficial 'hard normal samples'. These two types of samples exhibit analogous loss behavior, which conventional loss-based methodologies struggle to differentiate. To tackle this problem, we propose a novel approach that supplements traditional loss behavior with 'parameter behavior', enabling a more granular characterization of anomalous patterns. Parameter behavior is formalized by measuring the parametric response to minute perturbations in input samples. Leveraging the complementary nature of parameter and loss behaviors, we further propose a dual Parameter-Loss Data Augmentation method (termed PLDA), implemented within the reinforcement learning paradigm. During the training phase of anomaly detection, PLDA dynamically augments the training data through an iterative process that mitigates anomaly contaminations while amplifying informative hard normal samples. PLDA is highly versatile: it can serve as a plug-in component that integrates seamlessly with existing anomaly detectors to enhance their detection performance. Extensive experiments on ten datasets show that PLDA significantly improves the performance of four distinct detectors by up to 8%, outperforming three state-of-the-art data augmentation methods. |
DOI: | 10.48550/arxiv.2410.21322 |
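
The abstract only sketches how 'parameter behavior' is formalized, namely as the parametric response to minute perturbations of an input sample. Below is a minimal, hypothetical PyTorch sketch of one way such a measure could be computed for a single time-series window; the model, the reconstruction loss, the perturbation scale `eps`, and the gradient-shift norm are all illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def parameter_behavior(model: nn.Module, x: torch.Tensor, loss_fn, eps: float = 1e-3) -> float:
    """Proxy for 'parameter behavior': how much the per-sample gradients shift
    when the input is perturbed by a minute amount of noise (an assumption,
    not the paper's exact formalization)."""
    def flat_grads(inp: torch.Tensor) -> torch.Tensor:
        model.zero_grad()
        loss = loss_fn(model(inp), inp)  # reconstruction-style loss, assumed here
        loss.backward()
        return torch.cat([p.grad.detach().flatten()
                          for p in model.parameters() if p.grad is not None])

    g_clean = flat_grads(x)
    g_pert = flat_grads(x + eps * torch.randn_like(x))
    # A large gradient shift under a tiny perturbation is read as a sign of an
    # anomaly contamination; loss behavior alone would not distinguish it from
    # a hard normal sample with a similarly high loss.
    return (g_pert - g_clean).norm().item()

# Toy usage: an autoencoder-style detector over 32-step windows (illustrative only).
model = nn.Sequential(nn.Linear(32, 8), nn.ReLU(), nn.Linear(8, 32))
window = torch.randn(1, 32)
print(f"parameter-behavior score: {parameter_behavior(model, window, F.mse_loss):.4f}")
```

In the abstract's framing, such a parameter-behavior signal complements ordinary loss behavior, so a suspicious sample can be down-weighted as a likely anomaly contamination or up-weighted as an informative hard normal sample during PLDA's reinforcement-learning-driven augmentation.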