An intelligent ensemble of long‐short‐term memory with genetic algorithm for network anomaly identification

Bibliographic details
Published in: Transactions on Emerging Telecommunications Technologies, 2022-10, Vol. 33 (10), p. n/a
Authors: Thaseen, I. Sumaiya, Chitturi, Arun Krishna, Al-Turjman, Fadi, Shankar, Achyut, Ghalib, Muhammad Rukunuddin, Abhishek, Kumar
Format: Article
Language: English
Online access: Full text
Description
Summary: Cyberattacks are increasing rapidly with the rapid advancement of the Internet, and the cybersecurity situation is not optimistic. Anomaly detection is one of the challenging areas of network security and plays a significant role in any organization. Many anomaly detection systems identify malicious activities by deploying machine learning and deep learning techniques. The major contribution of this research is an anomaly detection model for networks that uses a homogeneous ensemble of Long Short-Term Memory (LSTM) networks integrated with a Genetic Algorithm (GA) for feature extraction. An extensive literature review of anomaly detection approaches that utilize deep learning algorithms is presented. The NSL-KDD and UNSW-NB15 datasets are used to evaluate the proposed network anomaly model. The experimental analysis shows that the proposed ensemble is superior to other ensembles, with a maximum accuracy of 99.9% and a minimum false alarm rate of 1.56% on the NSL-KDD dataset, and a maximum accuracy of 99.3% with a false alarm rate of 1.7% on the UNSW-NB15 dataset. Hence, the proposed model performs well on both datasets. The anomaly identification model integrates a Genetic Algorithm (GA) for feature extraction with a Long Short-Term Memory (LSTM) deep learning model for classification. Feature selection using the GA yields an optimal subset that enhances the classification performance of the LSTM. The proposed model has been evaluated on the benchmark datasets NSL-KDD and the more recent UNSW-NB15. The model exhibits maximum performance, and a comparison with various other ensembles across different metrics demonstrates its superiority.
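
The abstract describes a GA wrapper for feature selection feeding a homogeneous LSTM ensemble. The sketch below illustrates that general pattern only, not the authors' actual implementation: binary chromosomes encode feature subsets, a chromosome's fitness is the validation accuracy of a small LSTM trained on its subset, and the best subset then feeds a majority-voting ensemble of identically configured LSTMs. All hyperparameters (population size, generations, mutation rate, LSTM width, number of ensemble members), the use of Keras, and the choice of GA operators are assumptions made for illustration; each tabular record is treated as a one-step sequence.

import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

def subset(X, cols):
    # Select feature columns and reshape each record into a one-step sequence (n, 1, k).
    return X[:, cols][:, None, :]

def build_lstm(n_features):
    # Small binary LSTM classifier; width and optimizer are illustrative choices.
    model = keras.Sequential([
        keras.Input(shape=(1, n_features)),
        layers.LSTM(32),
        layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

def fitness(mask, X_tr, y_tr, X_val, y_val, epochs=3):
    # Wrapper fitness: validation accuracy of an LSTM trained on the masked feature subset.
    if mask.sum() == 0:
        return 0.0
    cols = np.flatnonzero(mask)
    model = build_lstm(len(cols))
    model.fit(subset(X_tr, cols), y_tr, epochs=epochs, verbose=0)
    _, acc = model.evaluate(subset(X_val, cols), y_val, verbose=0)
    return acc

def ga_select(X_tr, y_tr, X_val, y_val, pop=10, gens=5, p_mut=0.05, seed=0):
    # GA over binary feature masks: elitism, binary tournament selection,
    # single-point crossover, and bit-flip mutation.
    rng = np.random.default_rng(seed)
    n = X_tr.shape[1]
    population = rng.integers(0, 2, size=(pop, n))
    best, best_score = population[0], -1.0
    for _ in range(gens):
        scores = np.array([fitness(ind, X_tr, y_tr, X_val, y_val) for ind in population])
        if scores.max() > best_score:
            best, best_score = population[scores.argmax()].copy(), scores.max()
        nxt = [population[scores.argmax()].copy()]        # elitism: carry over the best mask
        while len(nxt) < pop:
            parents = []
            for _ in range(2):                             # binary tournament selection
                i, j = rng.choice(pop, size=2, replace=False)
                parents.append(population[i] if scores[i] >= scores[j] else population[j])
            cut = int(rng.integers(1, n))                  # single-point crossover
            child = np.concatenate([parents[0][:cut], parents[1][cut:]])
            child[rng.random(n) < p_mut] ^= 1              # bit-flip mutation
            nxt.append(child)
        population = np.array(nxt)
    return np.flatnonzero(best)

def train_ensemble(X_tr, y_tr, cols, members=5, epochs=5):
    # Homogeneous ensemble: identically configured LSTMs differing only in random initialization.
    models = []
    for _ in range(members):
        m = build_lstm(len(cols))
        m.fit(subset(X_tr, cols), y_tr, epochs=epochs, verbose=0)
        models.append(m)
    return models

def ensemble_predict(models, X, cols):
    # Majority vote over the members' binary decisions.
    votes = np.stack([(m.predict(subset(X, cols), verbose=0) > 0.5).astype(int) for m in models])
    return (votes.mean(axis=0) > 0.5).astype(int).ravel()

With NSL-KDD or UNSW-NB15 preprocessed into numeric feature matrices and binary labels (normal vs. attack), the pieces chain as cols = ga_select(X_tr, y_tr, X_val, y_val); models = train_ensemble(X_tr, y_tr, cols); y_pred = ensemble_predict(models, X_test, cols). The 99.9% and 99.3% accuracies quoted in the abstract refer to the authors' tuned configuration, not to this sketch.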
ISSN: 2161-3915
DOI: 10.1002/ett.4149