Combining Scores in Multiple-Criteria Assessment Systems: The Impact of Combination Rule


Detailed Description

Saved in:
Bibliographic Details
Published in: The Gifted child quarterly, 2014-01, Vol. 58 (1), p. 69-89
Main authors: McBee, Matthew T., Peters, Scott J., Waterman, Craig
Format: Article
Language: English (eng)
Subjects:
Online access: Full text
Description: Best practice in gifted and talented identification procedures involves making decisions on the basis of multiple measures. However, very little research has investigated the impact of different methods of combining multiple measures. This article examines the consequences of the conjunctive (“and”), disjunctive/complementary (“or”), and compensatory (“mean”) models for combining scores from multiple assessments. It considers the impact of rule choice on the size of the student population, the ability heterogeneity of the identified students, and the psychometric performance of such systems. It also uses statistical simulation to examine the performance of the state of Georgia’s mandated and complex multiple-criteria assessment system.
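The three combination rules named in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation or the Georgia system; the function name, the example scores, and the single shared cutoff are hypothetical choices for illustration only.

```python
from statistics import mean

def identify(scores, cutoff, rule):
    """Apply one score-combination rule to a student's assessment scores.

    scores: list of scores on multiple measures (e.g., percentile ranks)
    cutoff: identification threshold applied under the given rule
    rule:   'and' (conjunctive), 'or' (disjunctive), or 'mean' (compensatory)
    """
    if rule == "and":    # conjunctive: every measure must clear the cutoff
        return all(s >= cutoff for s in scores)
    if rule == "or":     # disjunctive: any single measure suffices
        return any(s >= cutoff for s in scores)
    if rule == "mean":   # compensatory: strength on one measure offsets weakness on another
        return mean(scores) >= cutoff
    raise ValueError(f"unknown rule: {rule}")

# A student strong on one measure and weaker on another (hypothetical scores):
scores = [96, 88]
print(identify(scores, 90, "and"))   # False: the 88 blocks identification
print(identify(scores, 90, "or"))    # True: the 96 alone suffices
print(identify(scores, 90, "mean"))  # True: the mean of 92.0 clears the cutoff
```

The same scores yield different decisions under each rule, which is why rule choice changes both the size and the heterogeneity of the identified population.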
DOI: 10.1177/0016986213513794
ISSN: 0016-9862
EISSN: 1934-9041
Source: Access via SAGE
Subjects:
Ability Identification
Academic Achievement
Academically Gifted
Best Practices
Cognitive Abilities Test
Cognitive Ability
Correlation
Creativity
Decision Making
Educational tests & measurements
Error of Measurement
Evaluation Criteria
Georgia
Gifted children
Iowa Tests of Basic Skills
Motivation
Psychological tests
Reliability
Simulation
Stanford Achievement Tests
Statistical Analysis
Student Evaluation
Student Motivation
Students
Testing Programs