Testing SOAR Tools in Use
Modern security operation centers (SOCs) rely on operators and a tapestry of logging and alerting tools with large scale collection and query abilities. SOC investigations are tedious as they rely on manual efforts to query diverse data sources, overlay related logs, and correlate the data into information and then document results in a ticketing system.
Published in: | arXiv.org 2023-02 |
---|---|
Main authors: | Bridges, Robert A; Rice, Ashley E; Oesch, Sean; Nichols, Jeff A; Watson, Cory; Spakes, Kevin; Norem, Savannah; Huettel, Mike; Jewell, Brian; Weber, Brian; Gannon, Connor; Bizovi, Olivia; Hollifield, Samuel C; Erwin, Samantha |
Format: | Article |
Language: | eng |
Subjects: | Automation; Computer Science - Cryptography and Security; Configurations; Design of experiments; Displays; New technology; Security |
Online access: | Full text |
---|---|
container_title | arXiv.org |
creator | Bridges, Robert A; Rice, Ashley E; Oesch, Sean; Nichols, Jeff A; Watson, Cory; Spakes, Kevin; Norem, Savannah; Huettel, Mike; Jewell, Brian; Weber, Brian; Gannon, Connor; Bizovi, Olivia; Hollifield, Samuel C; Erwin, Samantha |
description | Modern security operation centers (SOCs) rely on operators and a tapestry of logging and alerting tools with large scale collection and query abilities. SOC investigations are tedious as they rely on manual efforts to query diverse data sources, overlay related logs, and correlate the data into information and then document results in a ticketing system. Security orchestration, automation, and response (SOAR) tools are a new technology that promise to collect, filter, and display needed data; automate common tasks that require SOC analysts' time; facilitate SOC collaboration; and, improve both efficiency and consistency of SOCs. SOAR tools have never been tested in practice to evaluate their effect and understand them in use. In this paper, we design and administer the first hands-on user study of SOAR tools, involving 24 participants and 6 commercial SOAR tools. Our contributions include the experimental design, itemizing six characteristics of SOAR tools and a methodology for testing them. We describe configuration of the test environment in a cyber range, including network, user, and threat emulation; a full SOC tool suite; and creation of artifacts allowing multiple representative investigation scenarios to permit testing. We present the first research results on SOAR tools. We found that SOAR configuration is critical, as it involves creative design for data display and automation. We found that SOAR tools increased efficiency and reduced context switching during investigations, although ticket accuracy and completeness (indicating investigation quality) decreased with SOAR use. Our findings indicated that user preferences are slightly negatively correlated with their performance with the tool; overautomation was a concern of senior analysts, and SOAR tools that balanced automation with assisting a user to make decisions were preferred. |
doi_str_mv | 10.48550/arxiv.2208.06075 |
format | Article |
published version | doi:10.1016/j.cose.2023.103201 |
fulltext | fulltext |
identifier | EISSN: 2331-8422 |
ispartof | arXiv.org, 2023-02 |
issn | 2331-8422 |
language | eng |
recordid | cdi_arxiv_primary_2208_06075 |
source | arXiv.org; Free E-Journals |
subjects | Automation; Computer Science - Cryptography and Security; Configurations; Design of experiments; Displays; New technology; Security |
title | Testing SOAR Tools in Use |
url | https://doi.org/10.48550/arxiv.2208.06075 |