Evolutionary Evaluation: Implications for evaluators, researchers, practitioners, funders and the evidence-based program mandate

•Present the theoretical foundations for Evolutionary Evaluation (EE).
•Operationalize EE by defining program and evaluation evolutionary phases.
•Discuss phase alignment to ensure optimal decision-making for programs & evaluation.
•Discuss implications of EE for how “evidence-based programs” are defined.
•Discuss implications of EE for management of individual & portfolios of programs.

Bibliographic details
Published in: Evaluation and program planning 2014-08, Vol.45, p.127-139
Main authors: Urban, Jennifer Brown, Hargraves, Monica, Trochim, William M.
Format: Article
Language: eng
Subjects:
Online access: Full text
container_end_page 139
container_issue
container_start_page 127
container_title Evaluation and program planning
container_volume 45
creator Urban, Jennifer Brown
Hargraves, Monica
Trochim, William M.
description •Present the theoretical foundations for Evolutionary Evaluation (EE). •Operationalize EE by defining program and evaluation evolutionary phases. •Discuss phase alignment to ensure optimal decision-making for programs & evaluation. •Discuss implications of EE for how “evidence-based programs” are defined. •Discuss implications of EE for management of individual & portfolios of programs. Evolutionary theory, developmental systems theory, and evolutionary epistemology provide deep theoretical foundations for understanding programs, their development over time, and the role of evaluation. This paper relates core concepts from these powerful bodies of theory to program evaluation. Evolutionary Evaluation is operationalized in terms of program and evaluation evolutionary phases, which are in turn aligned with multiple types of validity. The model of Evolutionary Evaluation incorporates Chen's conceptualization of bottom-up versus top-down program development. The resulting framework has important implications for many program management and evaluation issues. The paper illustrates how an Evolutionary Evaluation perspective can illuminate important controversies in evaluation, using the example of the appropriate role of randomized controlled trials in a way that encourages a rethinking of “evidence-based programs”. From an Evolutionary Evaluation perspective, prevailing interpretations of rigor and mandates for evidence-based programs pose significant challenges to program evolution. This perspective also illuminates the consequences of misalignment between program and evaluation phases; the importance of supporting both researcher-derived and practitioner-derived programs; and the need for variation and evolutionary phase diversity within portfolios of programs.
doi_str_mv 10.1016/j.evalprogplan.2014.03.011
format Article
coden EPPLDO
orcidid https://orcid.org/0000-0002-6074-4445
pmid 24780281
publisher England: Elsevier Ltd
rights Copyright © 2014 Elsevier Ltd. All rights reserved.
fulltext fulltext
identifier ISSN: 0149-7189
ispartof Evaluation and program planning, 2014-08, Vol.45, p.127-139
issn 0149-7189
1873-7870
language eng
recordid cdi_proquest_miscellaneous_1550985633
source Applied Social Sciences Index & Abstracts (ASSIA); MEDLINE; Elsevier ScienceDirect Journals
subjects Conceptualization
Controversy
Developmental systems theory
Epistemology
Evaluation design
Evaluators
Evidence
Evidence based
Evidence-based program (EBP)
Evolution
Evolutionary epistemology
Evolutionary Evaluation
Evolutionary theory
Experimental design
Humans
Lifecycles
Portfolios
Program Administration
Program Development
Program Development - methods
Program Evaluation
Program Evaluation - methods
Program evolution
Project evaluation
Project management
Randomized controlled trial (RCT)
Randomized Controlled Trials
Randomized Controlled Trials as Topic
Research Design
Systems Approach
Systems evaluation
Systems Theory
Theory
Validity
title Evolutionary Evaluation: Implications for evaluators, researchers, practitioners, funders and the evidence-based program mandate
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-29T21%3A34%3A54IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Evolutionary%20Evaluation:%20Implications%20for%20evaluators,%20researchers,%20practitioners,%20funders%20and%20the%20evidence-based%20program%20mandate&rft.jtitle=Evaluation%20and%20program%20planning&rft.au=Urban,%20Jennifer%20Brown&rft.date=2014-08-01&rft.volume=45&rft.spage=127&rft.epage=139&rft.pages=127-139&rft.issn=0149-7189&rft.eissn=1873-7870&rft.coden=EPPLDO&rft_id=info:doi/10.1016/j.evalprogplan.2014.03.011&rft_dat=%3Cproquest_cross%3E3340709041%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=1537840412&rft_id=info:pmid/24780281&rft_els_id=S0149718914000378&rfr_iscdi=true