Evaluation of software visualization tools: Lessons learned

Many software visualization (SoftVis) tools are continuously being developed, by researchers as well as by software development companies. To determine whether these tools are effective in helping their target users, it is desirable that they be subjected to proper evaluation. Despite this, there is still a lack of general guidelines on how such evaluations should be carried out, and many tool developers perform very limited or no evaluation of their tools. Everyone who carries out an evaluation, however, gains experience which, if shared, can guide future evaluators. This paper presents the lessons learned from evaluating over 20 SoftVis tools with over 90 users in five different studies spread over a period of more than two years. The lessons cover the selection of the tools, the tasks, and the evaluation participants, as well as the duration of the evaluation experiment, its location, the procedure followed when carrying it out, and the motivation of the participants. Finally, an analysis of the lessons learned is presented, in the hope that they will assist future SoftVis tool evaluators.

Bibliographic Details
Main authors: Sensalire, M.; Ogao, P.; Telea, A.
Format: Conference proceeding
Language: English
DOI: 10.1109/VISSOF.2009.5336431
ISBN: 1424450276; 9781424450275
eISBN: 9781424450268; 1424450268
Published in: 2009 5th IEEE International Workshop on Visualizing Software for Understanding and Analysis, 2009, pp. 19-26
Source: IEEE Electronic Library (IEL) Conference Proceedings
Subjects: Animation; Cinematography; Computer graphics; Design engineering; Guidelines; Mathematics; Performance evaluation; Programming; Software tools; Visualization