Adversarial Attacks on Machine Learning Cybersecurity Defences in Industrial Control Systems
Main authors: , , , ,
Format: Article
Language: eng
Subjects:
Online access: Order full text
Abstract: The proliferation and application of machine learning based Intrusion Detection Systems (IDS) have allowed for more flexibility and efficiency in the automated detection of cyber attacks in Industrial Control Systems (ICS). However, the introduction of such IDSs has also created an additional attack vector: the learning models themselves may be subject to cyber attacks, otherwise referred to as Adversarial Machine Learning (AML). Such attacks may have severe consequences in ICS, as adversaries could potentially bypass the IDS. This could lead to delayed attack detection, which may result in infrastructure damage, financial loss, and even loss of life. This paper explores how adversarial learning can be used to target supervised models by generating adversarial samples using the Jacobian-based Saliency Map attack and examining the resulting classification behaviour. The analysis also explores how such samples can support the robustness of supervised models through adversarial training. An authentic power system dataset was used to support the experiments presented herein. Overall, the classification performance of two widely used classifiers, Random Forest and J48, decreased by 16 and 20 percentage points when adversarial samples were present. Their performance improved following adversarial training, demonstrating improved robustness against such attacks.
DOI: 10.48550/arxiv.2004.05005
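The workflow the abstract describes can be sketched roughly as follows. This is an illustrative assumption, not the authors' code: because tree-based classifiers such as Random Forest and J48 expose no gradients, the sketch crafts simplified JSMA-style adversarial samples against a small differentiable surrogate network, checks how they transfer to a Random Forest, and then retrains the forest on the crafted samples as an adversarial-training step. All data shapes, labels, and hyperparameters are placeholders.

```python
# Illustrative sketch only (not the paper's code): craft JSMA-style adversarial
# samples against a differentiable surrogate, transfer them to a Random Forest,
# then retrain the forest on them (adversarial training). All shapes, labels,
# and hyperparameters below are placeholder assumptions.
import numpy as np
import torch
import torch.nn as nn
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score


def jsma_perturb(surrogate, x, target_class, theta=0.1, max_features=10):
    """Simplified single-feature JSMA: repeatedly nudge the most salient
    feature towards the target class until the prediction flips or the
    perturbation budget runs out."""
    x_adv = x.clone().detach()
    perturbed = set()
    for _ in range(max_features):
        x_adv.requires_grad_(True)
        logits = surrogate(x_adv.unsqueeze(0)).squeeze(0)
        # Jacobian of every class logit with respect to the input features
        jac = torch.stack([
            torch.autograd.grad(logits[c], x_adv, retain_graph=True)[0]
            for c in range(logits.shape[0])
        ])
        target_grad = jac[target_class]
        other_grad = jac.sum(dim=0) - target_grad
        # Saliency map: features that raise the target logit and lower the rest
        saliency = torch.where(
            (target_grad > 0) & (other_grad < 0),
            target_grad * other_grad.abs(),
            torch.zeros_like(target_grad),
        )
        if perturbed:
            saliency[list(perturbed)] = 0.0  # modify each feature at most once
        idx = int(torch.argmax(saliency))
        if saliency[idx] <= 0:
            break
        x_adv = x_adv.detach()
        x_adv[idx] = torch.clamp(x_adv[idx] + theta, 0.0, 1.0)
        perturbed.add(idx)
        if int(torch.argmax(surrogate(x_adv.unsqueeze(0)))) == target_class:
            break
    return x_adv.detach()


# Synthetic stand-in for a normalised power-system feature matrix.
rng = np.random.default_rng(0)
n_features = 20
X = rng.random((500, n_features)).astype(np.float32)
y = (X[:, 0] + X[:, 1] > 1.0).astype(np.int64)  # placeholder attack/normal labels

# Differentiable surrogate trained only to provide gradients for the saliency maps.
surrogate = nn.Sequential(nn.Linear(n_features, 32), nn.ReLU(), nn.Linear(32, 2))
opt = torch.optim.Adam(surrogate.parameters(), lr=1e-2)
Xt, yt = torch.from_numpy(X), torch.from_numpy(y)
for _ in range(200):
    opt.zero_grad()
    nn.functional.cross_entropy(surrogate(Xt), yt).backward()
    opt.step()

# Perturb some "attack" rows so they are pushed towards the "normal" class (0).
attack_rows = np.where(y == 1)[0][:50]
X_adv = np.stack([jsma_perturb(surrogate, Xt[i], target_class=0).numpy()
                  for i in attack_rows])

rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print("clean accuracy on attack rows:",
      accuracy_score(y[attack_rows], rf.predict(X[attack_rows])))
print("accuracy on adversarial rows :",
      accuracy_score(y[attack_rows], rf.predict(X_adv)))

# Adversarial training: add the crafted samples, correctly labelled, and retrain.
rf_hard = RandomForestClassifier(n_estimators=100, random_state=0).fit(
    np.vstack([X, X_adv]), np.concatenate([y, y[attack_rows]]))
print("hardened accuracy on adv rows:",
      accuracy_score(y[attack_rows], rf_hard.predict(X_adv)))
```

On toy data like this the hardened forest trivially recovers on the crafted rows; the paper's contribution is evaluating the same attack-then-retrain loop on an authentic power system dataset with Random Forest and J48.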