Improving Log-Based Anomaly Detection by Pre-Training Hierarchical Transformers
Published in: IEEE Transactions on Computers, 2023-09, Vol. 72 (9), p. 1-13
Main Authors: , , , , ,
Format: Article
Language: English
Subjects:
Online Access: Order full text
Abstract: Pre-trained models, such as BERT, have resulted in significant improvements in many natural language processing (NLP) applications. However, due to differences in word distribution and domain data distribution, directly applying NLP advances to log analysis faces performance challenges. This paper studies how to adapt the recently introduced pre-trained language model BERT for log analysis. In this work, we propose a pre-trained log representation model with hierarchical bidirectional encoder transformers (namely, HilBERT). Unlike previous work, which used raw text as pre-training data, we parse logs into templates before using the log templates to pre-train HilBERT. We also design a hierarchical transformer model to capture log template sequence-level information. We use log-based anomaly detection as the downstream task and fine-tune our model with different log data. Our experiments demonstrate that HilBERT outperforms other baseline techniques on unstable log data. While BERT obtains performance comparable to that of previous state-of-the-art models, HilBERT can significantly address the problem of log instability and achieve accurate and robust results.
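The abstract describes a two-level design: logs are first parsed into templates (the abstract does not name a parser; Drain is a common choice), a token-level encoder represents each template, and a sequence-level encoder models the order of templates before an anomaly-detection head. The sketch below is only an illustration of that hierarchical idea under assumed layer sizes, names, and framework (PyTorch); it is not the authors' implementation.

```python
# A minimal sketch of the hierarchical idea in the abstract, not the authors' code.
# Lower level: a Transformer encoder contextualizes the tokens of one parsed log
# template. Upper level: a second Transformer encoder contextualizes the sequence
# of template vectors (the "template sequence-level information"), and a linear
# head classifies the whole sequence as normal or anomalous.
# All hyperparameters and names here are illustrative assumptions.
import torch
import torch.nn as nn


class HierarchicalLogEncoder(nn.Module):
    def __init__(self, vocab_size=5000, d_model=256, nhead=4,
                 num_token_layers=2, num_template_layers=2, num_classes=2):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, d_model, padding_idx=0)
        token_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        # Encodes the tokens inside a single log template.
        self.token_encoder = nn.TransformerEncoder(token_layer, num_token_layers)
        template_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        # Encodes the sequence of template vectors within a log window.
        self.template_encoder = nn.TransformerEncoder(template_layer, num_template_layers)
        self.classifier = nn.Linear(d_model, num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, num_templates, tokens_per_template)
        b, t, s = token_ids.shape
        x = self.token_emb(token_ids.view(b * t, s))   # (b*t, s, d_model)
        x = self.token_encoder(x).mean(dim=1)          # pool tokens -> (b*t, d_model)
        x = self.template_encoder(x.view(b, t, -1))    # (b, t, d_model)
        return self.classifier(x.mean(dim=1))          # sequence-level logits


# Toy usage: 2 log windows, each with 8 templates of 16 token ids.
model = HierarchicalLogEncoder()
logits = model(torch.randint(1, 5000, (2, 8, 16)))
print(logits.shape)  # torch.Size([2, 2])
```

In a setup like the paper's, the token-level encoder would be pre-trained on parsed log templates (BERT-style) before the whole stack is fine-tuned on labeled anomaly-detection data; the mean pooling above is one simple choice for aggregating tokens and templates.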
ISSN: 0018-9340, 1557-9956
DOI: 10.1109/TC.2023.3257518