Efficient and interpretable SRU combined with TabNet for network intrusion detection in the big data environment



Bibliographic Details
Published in: International Journal of Information Security, 2023-06, Vol. 22 (3), pp. 679-689
Authors: Chen, Yingchun; Li, Jinguo; Guo, Naiwang
Format: Article
Language: English
Online Access: Full text
Description
Abstract: While digital application infrastructure services are becoming increasingly abundant and the scale of the network continues to expand, many new network vulnerabilities and attacks (such as DoS, Botnet, and MITM) continue to emerge. The timely and accurate detection of network anomalies is therefore critical for network stability. Previous deep-learning-based approaches have faced difficulties in practical adoption due to their lack of interpretability. Recently, Recurrent Neural Networks have shown a superior ability to analyze high-dimensional, complex network flows. However, these methods suffer from limited parallelizability and time-consuming training, so they cannot meet the particular requirements of intrusion detection. To solve these issues, we propose an efficient and interpretable intrusion detection scheme based on simple recurrent networks (Tab-AttSRU) to accurately identify abnormal network traffic patterns. Concretely, to obtain high-quality interpretations, we utilize model-specific feature importance and the learnable mask of TabNet for soft feature selection; the sequential attention mechanism selects the decision-making features needed for interpretability. To realize efficient parallel computing, we combine SRU with an attention mechanism to capture latent connections between traffic at different times and implement it on Spark. The performance of the proposed method is assessed on the benchmark UNSW-NB15 dataset and the real-world UKM-IDS20 dataset. Experimental results demonstrate the efficiency and interpretability of the proposed method.
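To illustrate the efficiency argument behind SRU-based detection, the sketch below shows a minimal single-layer SRU cell in PyTorch: the heavy matrix projections for all time steps are computed in one batched call, and only a cheap element-wise recurrence remains sequential. This is a hedged illustration of the generic SRU formulation (Lei et al.), not the authors' Tab-AttSRU implementation; the class name MinimalSRU, the hyperparameters, and the use of the projected input in the highway skip are assumptions made for a self-contained example.

```python
# Minimal SRU sketch, assuming PyTorch. Illustrative only, not the paper's code.
import torch
import torch.nn as nn

class MinimalSRU(nn.Module):
    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        # One fused projection yields the transformed input, forget gate, and
        # reset gate for every time step at once -- the parallelizable part.
        self.proj = nn.Linear(input_size, 3 * hidden_size)
        self.hidden_size = hidden_size

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (seq_len, batch, input_size), e.g. a window of flow records
        seq_len, batch, _ = x.shape
        u = self.proj(x)                              # (seq_len, batch, 3H)
        x_tilde, f_pre, r_pre = u.chunk(3, dim=-1)
        f = torch.sigmoid(f_pre)                      # forget gates
        r = torch.sigmoid(r_pre)                      # reset gates

        c = x.new_zeros(batch, self.hidden_size)      # internal state c_0
        outputs = []
        for t in range(seq_len):
            # Only this element-wise recurrence is sequential:
            # c_t = f_t * c_{t-1} + (1 - f_t) * x~_t
            c = f[t] * c + (1.0 - f[t]) * x_tilde[t]
            # Highway-style output; the skip uses x~_t here so that the
            # dimensions match without an extra projection (an assumption).
            h = r[t] * torch.tanh(c) + (1.0 - r[t]) * x_tilde[t]
            outputs.append(h)
        return torch.stack(outputs)                   # (seq_len, batch, H)

# Usage: a window of 10 flow records, batch of 32, 49 illustrative features
sru = MinimalSRU(input_size=49, hidden_size=64)
h = sru(torch.randn(10, 32, 49))
print(h.shape)  # torch.Size([10, 32, 64])
```

Because the recurrence touches only element-wise products, each time step costs O(batch × hidden) rather than a matrix multiply, which is what allows SRU training to parallelize far better than LSTM/GRU-style cells and motivates the distributed Spark deployment described in the abstract.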
ISSN: 1615-5262, 1615-5270
DOI: 10.1007/s10207-022-00656-w