Detecting Anomalies in System Logs With a Compact Convolutional Transformer
Computer systems play an important role in ensuring the correct functioning of critical systems such as train stations, power stations, emergency systems, and server infrastructures. To ensure the correct functioning and safety of these computer systems, the detection of abnormal system behavior is crucial. For that purpose, monitoring log data (mirroring the recent and current system status) is very common. Because log data consist mainly of words and numbers, recent work has used Transformer-based networks to analyze the log data and predict anomalies. Despite their success in fields such as natural language processing and computer vision, the main disadvantage of Transformers is the huge number of trainable parameters, leading to long training times. In this work, we use a Compact Convolutional Transformer to detect anomalies in log data. Using convolutional layers leads to a much smaller number of trainable parameters and enables the processing of many consecutive log lines. We evaluate the proposed network on two standard datasets for log data anomaly detection, Blue Gene/L (BGL) and Spirit. Our results demonstrate that the combination of convolutional processing and self-attention improves performance for anomaly detection in comparison to other self-supervised Transformer-based approaches, and is even on par with supervised approaches.
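The abstract notes that the convolutional front-end enables the processing of many consecutive log lines at once. A minimal sketch of the windowing step such a setup implies is shown below; `window_log_lines` is a hypothetical illustrative helper, not the authors' code, and the any-line-anomalous labeling rule is assumed here as the common convention for BGL/Spirit-style datasets.

```python
def window_log_lines(lines, labels, window_size=64, stride=64):
    """Group consecutive log lines into fixed-size windows.

    A window is labeled anomalous (1) if any line inside it is anomalous,
    a common convention for BGL/Spirit-style log windows. Illustrative
    sketch only, not the authors' implementation.
    """
    windows, window_labels = [], []
    for start in range(0, len(lines) - window_size + 1, stride):
        windows.append(lines[start:start + window_size])
        window_labels.append(int(any(labels[start:start + window_size])))
    return windows, window_labels

# Tiny example: 6 lines, line index 3 is anomalous, windows of 3 lines.
lines = [f"log line {i}" for i in range(6)]
labels = [0, 0, 0, 1, 0, 0]
windows, window_labels = window_log_lines(lines, labels, window_size=3, stride=3)
print(window_labels)  # -> [0, 1]
```

Each window would then be tokenized and passed through the convolutional layers before self-attention is applied.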
Saved in:
Published in: | IEEE Access, 2023, Vol. 11, p. 113464-113479 |
---|---|
Main authors: | Larisch, Rene; Vitay, Julien; Hamker, Fred H. |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
container_end_page | 113479 |
---|---|
container_issue | |
container_start_page | 113464 |
container_title | IEEE access |
container_volume | 11 |
creator | Larisch, Rene; Vitay, Julien; Hamker, Fred H. |
description | Computer systems play an important role in ensuring the correct functioning of critical systems such as train stations, power stations, emergency systems, and server infrastructures. To ensure the correct functioning and safety of these computer systems, the detection of abnormal system behavior is crucial. For that purpose, monitoring log data (mirroring the recent and current system status) is very common. Because log data consist mainly of words and numbers, recent work has used Transformer-based networks to analyze the log data and predict anomalies. Despite their success in fields such as natural language processing and computer vision, the main disadvantage of Transformers is the huge number of trainable parameters, leading to long training times. In this work, we use a Compact Convolutional Transformer to detect anomalies in log data. Using convolutional layers leads to a much smaller number of trainable parameters and enables the processing of many consecutive log lines. We evaluate the proposed network on two standard datasets for log data anomaly detection, Blue Gene/L (BGL) and Spirit. Our results demonstrate that the combination of convolutional processing and self-attention improves performance for anomaly detection in comparison to other self-supervised Transformer-based approaches, and is even on par with supervised approaches. |
doi_str_mv | 10.1109/ACCESS.2023.3323252 |
format | Article |
publisher | Piscataway: IEEE |
rights | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2023 |
coden | IAECCG |
eissn | 2169-3536 |
orcid | 0000-0003-3544-0631; 0000-0001-5229-2349; 0000-0001-9104-7143 |
fulltext | fulltext |
identifier | ISSN: 2169-3536 |
ispartof | IEEE access, 2023, Vol.11, p.113464-113479 |
issn | 2169-3536 |
language | eng |
recordid | cdi_crossref_primary_10_1109_ACCESS_2023_3323252 |
source | IEEE Open Access Journals; DOAJ Directory of Open Access Journals; EZB-FREE-00999 freely available EZB journals |
subjects | Anomalies; Anomaly detection; Computer vision; Convolutional codes; Deep learning; Kernel; Natural language processing; Neural networks; Parameters; Power plants; Railway stations; Self-supervised learning; Supercomputers; transformer; Transformers |
title | Detecting Anomalies in System Logs With a Compact Convolutional Transformer |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-23T13%3A30%3A06IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Detecting%20Anomalies%20in%20System%20Logs%20With%20a%20Compact%20Convolutional%20Transformer&rft.jtitle=IEEE%20access&rft.au=Larisch,%20Rene&rft.date=2023&rft.volume=11&rft.spage=113464&rft.epage=113479&rft.pages=113464-113479&rft.issn=2169-3536&rft.eissn=2169-3536&rft.coden=IAECCG&rft_id=info:doi/10.1109/ACCESS.2023.3323252&rft_dat=%3Cproquest_cross%3E2879385857%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2879385857&rft_id=info:pmid/&rft_ieee_id=10285328&rft_doaj_id=oai_doaj_org_article_ecd2a890aaed4831852ecca75aac72d0&rfr_iscdi=true |