Why reports of outcome evaluations are often biased or uninterpretable: Examples from evaluations of drug abuse prevention programs

This paper examines why the conclusions of many outcome evaluations do not stand up to scrutiny, drawing upon examples from evaluations of drug abuse prevention programs. It is argued that the factors that undermine the integrity of these studies are not simply due to limited means or resources in conducting such research, but are in large part due to social-structural problems that influence the design and implementation of the research. These include the institutional pressures involved in conducting "soft money" research, as well as academic pressure to publish or perish and conflicts of interest. Some potential solutions are proposed that may reduce the institutional pressures and constraints that undermine evaluation studies.

Detailed description

Bibliographic details
Published in: Evaluation and program planning, 1993, Vol.16 (1), p.1-9
Main author: Moskowitz, Joel M.
Format: Article
Language: English
Subjects:
Online access: Full text
container_end_page 9
container_issue 1
container_start_page 1
container_title Evaluation and program planning
container_volume 16
creator Moskowitz, Joel M.
description This paper examines why the conclusions of many outcome evaluations do not stand up to scrutiny, drawing upon examples from evaluations of drug abuse prevention programs. It is argued that the factors that undermine the integrity of these studies are not simply due to limited means or resources in conducting such research, but are in large part due to social-structural problems that influence the design and implementation of the research. These include the institutional pressures involved in conducting "soft money" research, as well as academic pressure to publish or perish and conflicts of interest. Some potential solutions are proposed that may reduce the institutional pressures and constraints that undermine evaluation studies.
doi_str_mv 10.1016/0149-7189(93)90032-4
format Article
fulltext fulltext
identifier ISSN: 0149-7189
ispartof Evaluation and program planning, 1993, Vol.16 (1), p.1-9
issn 0149-7189
language eng
recordid cdi_proquest_miscellaneous_61651610
source RePEc; Elsevier ScienceDirect Journals; Sociological Abstracts
subjects Bias
Drug Abuse
Evaluation Problems
Institutional Characteristics
Methodological Problems
Outcome Evaluations
Prevention
Program Evaluation
Research Design
Research Methodology
Research Problems
Social Problems
Summative Evaluation
Treatment Programs
title Why reports of outcome evaluations are often biased or uninterpretable: Examples from evaluations of drug abuse prevention programs