Issues of rigor and feasibility when observing the quality of program implementation: A case study
Highlights: • Evaluation feasibility issues must be balanced against reliability and validity. • We addressed this balancing when measuring quality of program implementation. • We present a case example, with conclusions.

Program evaluators have paid little attention in the literature to the manner in which measuring the quality of implementation with observations requires tradeoffs between rigor (reliability and validity) and program evaluation feasibility. We present a case example of how we addressed rigor in light of feasibility concerns when developing and conducting observations for measuring the quality of implementation of a small education professional development program. We discuss the results of meta-evaluative analyses of the reliability of the quality observations, and we present conclusions about conducting observations in a rigorous and feasible manner. The results show that the feasibility constraints that we faced did not notably reduce the rigor of our methods.
Saved in:

Published in: | Evaluation and program planning 2014-06, Vol.44, p.75-80 |
---|---|
Main authors: | Brandon, Paul R.; Lawton, Brian E.; Harrison, George M. |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
container_end_page | 80 |
---|---|
container_issue | |
container_start_page | 75 |
container_title | Evaluation and program planning |
container_volume | 44 |
creator | Brandon, Paul R.; Lawton, Brian E.; Harrison, George M. |
description | • Evaluation feasibility issues must be balanced against reliability and validity. • We addressed this balancing when measuring quality of program implementation. • We present a case example, with conclusions.
Program evaluators have paid little attention in the literature to the manner in which measuring the quality of implementation with observations requires tradeoffs between rigor (reliability and validity) and program evaluation feasibility. We present a case example of how we addressed rigor in light of feasibility concerns when developing and conducting observations for measuring the quality of implementation of a small education professional development program. We discuss the results of meta-evaluative analyses of the reliability of the quality observations, and we present conclusions about conducting observations in a rigorous and feasible manner. The results show that the feasibility constraints that we faced did not notably reduce the rigor of our methods. |
doi_str_mv | 10.1016/j.evalprogplan.2014.02.003 |
format | Article |
publisher | England: Elsevier Ltd |
pmid | 24631849 |
eissn | 1873-7870 |
coden | EPPLDO |
fulltext | fulltext |
identifier | ISSN: 0149-7189 |
ispartof | Evaluation and program planning, 2014-06, Vol.44, p.75-80 |
issn | 0149-7189; 1873-7870 |
language | eng |
recordid | cdi_proquest_miscellaneous_1541978818 |
source | Applied Social Sciences Index & Abstracts (ASSIA); MEDLINE; Elsevier ScienceDirect Journals Complete |
subjects | Cost Control - methods; Education, Professional - economics; Education, Professional - methods; Education, Professional - standards; Feasibility; Feasibility Studies; Humans; Observation; Observations; Organizational Case Studies; Professional Competence - standards; Professional development; Program Development - economics; Program Development - methods; Program Development - standards; Program Evaluation - economics; Program Evaluation - methods; Program Evaluation - standards; Project evaluation; Quality of program implementation; Reliability; Reproducibility of Results; Teaching - methods; Tradeoffs between feasibility and rigor |
title | Issues of rigor and feasibility when observing the quality of program implementation: A case study |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-14T14%3A20%3A46IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Issues%20of%20rigor%20and%20feasibility%20when%20observing%20the%20quality%20of%20program%20implementation:%20A%20case%20study&rft.jtitle=Evaluation%20and%20program%20planning&rft.au=Brandon,%20Paul%20R.&rft.date=2014-06-01&rft.volume=44&rft.spage=75&rft.epage=80&rft.pages=75-80&rft.issn=0149-7189&rft.eissn=1873-7870&rft.coden=EPPLDO&rft_id=info:doi/10.1016/j.evalprogplan.2014.02.003&rft_dat=%3Cproquest_cross%3E3288080551%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=1519614439&rft_id=info:pmid/24631849&rft_els_id=S0149718914000159&rfr_iscdi=true |