Revisiting Test Impact Analysis in Continuous Testing From the Perspective of Code Dependencies
In continuous testing, developers execute automated test cases once or even several times per day to ensure the quality of the integrated code. Although continuous testing helps ensure the quality of the code and reduces maintenance effort, it also significantly increases test execution overhead. In...
Saved in:

| Published in: | IEEE transactions on software engineering 2022-06, Vol.48 (6), p.1-1 |
|---|---|
| Main authors: | Peng, Zi; Chen, Tse-Hsun; Yang, Jinqiu |
| Format: | Article |
| Language: | eng |
| Subjects: | Automation; Computer bugs; continuous testing; empirical study; Impact analysis; Maintenance engineering; Manuals; Software; Software development; Source code; test impact analysis; test smells; Testing |
| Online access: | Request full text |
| container_end_page | 1 |
|---|---|
| container_issue | 6 |
| container_start_page | 1 |
| container_title | IEEE transactions on software engineering |
| container_volume | 48 |
| creator | Peng, Zi; Chen, Tse-Hsun; Yang, Jinqiu |
| description | In continuous testing, developers execute automated test cases once or even several times per day to ensure the quality of the integrated code. Although continuous testing helps ensure the quality of the code and reduces maintenance effort, it also significantly increases test execution overhead. In this paper, we empirically evaluate the effectiveness of test impact analysis from the perspective of code dependencies in the continuous testing setting. We first applied test impact analysis to one year of software development history in 11 large-scale open-source systems. We found that even though the number of changed files is small in daily commits (median ranges from 3 to 28 files), around 50% or more of the test cases are still impacted and need to be executed. Motivated by our finding, we further studied the code dependencies between source code files and test cases, and among test cases. We found that 1) test cases often focus on testing the integrated behaviour of the systems and 15% of the test cases have dependencies with more than 20 source code files; 2) 18% of the test cases have dependencies with other test cases, and test case inheritance is the most common cause of test case dependencies; and 3) we documented four dependency-related test smells that we uncovered in our manual study. Our study provides the first step towards studying and understanding the effectiveness of test impact analysis in the continuous testing setting and provides insights on improving test design and execution. |
| doi_str_mv | 10.1109/TSE.2020.3045914 |
| format | Article |
| fulltext | fulltext_linktorsrc |
| identifier | ISSN: 0098-5589 |
| ispartof | IEEE transactions on software engineering, 2022-06, Vol.48 (6), p.1-1 |
| issn | 0098-5589; 1939-3520 |
| language | eng |
| recordid | cdi_ieee_primary_9303402 |
| source | IEEE Electronic Library (IEL) |
| subjects | Automation; Computer bugs; continuous testing; empirical study; Impact analysis; Maintenance engineering; Manuals; Software; Software development; Source code; test impact analysis; test smells; Testing |
| title | Revisiting Test Impact Analysis in Continuous Testing From the Perspective of Code Dependencies |
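The technique the abstract evaluates, dependency-based test impact analysis, selects a test case for execution only when at least one of the files it depends on was changed in the commit under analysis. The sketch below illustrates that selection step only; the function, the dependency map, and the file names are illustrative assumptions and are not taken from the paper or its subject systems, where such maps would be recovered by static or dynamic dependency analysis.

```python
# Illustrative sketch of dependency-based test impact analysis.
# The dependency map and file names below are hypothetical examples.

def select_impacted_tests(test_dependencies, changed_files):
    """Return the tests whose dependencies overlap the changed files."""
    changed = set(changed_files)
    return {test for test, deps in test_dependencies.items() if deps & changed}


if __name__ == "__main__":
    deps = {
        "OrderServiceTest": {"OrderService.java", "Order.java"},
        "PaymentIntegrationTest": {"PaymentGateway.java", "OrderService.java"},
        "StringUtilsTest": {"StringUtils.java"},
    }
    # A commit touching a single source file can still impact several tests.
    print(select_impacted_tests(deps, {"OrderService.java"}))
    # -> {'OrderServiceTest', 'PaymentIntegrationTest'}
```

In this toy run, a one-file change selects two of the three tests, mirroring the abstract's observation that small daily commits can still impact a large share of the test suite.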