Dual-Objective Item Selection Criteria in Cognitive Diagnostic Computerized Adaptive Testing
The development of cognitive diagnostic computerized adaptive testing (CD-CAT) has provided a new perspective for gaining information about examinees' mastery of a set of cognitive attributes. This study proposes a new item selection method within the framework of dual-objective CD-CAT that simultaneously addresses examinees' attribute mastery status and overall test performance. The new procedure is based on the Jensen-Shannon (JS) divergence, a symmetrized version of the Kullback-Leibler divergence. We show that the JS divergence resolves the noncomparability problem of the dual information index and has close relationships with Shannon entropy, mutual information, and Fisher information. The performance of the JS divergence is evaluated in simulation studies in comparison with the methods available in the literature. Results suggest that the JS divergence achieves parallel or more precise recovery of latent trait variables compared to the existing methods and maintains practical advantages in computation and item pool usage.
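For readers unfamiliar with the index, the sketch below illustrates the textbook Jensen-Shannon divergence between two discrete distributions. It is a generic Python illustration, not the authors' exact dual-objective selection criterion; the function names and example probabilities are ours.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence KL(p || q) between two discrete distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p_i = 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def js_divergence(p, q):
    """Jensen-Shannon divergence: a symmetrized, bounded variant of the KL divergence."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)  # mixture of the two distributions
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

# Toy illustration: an item's correct/incorrect response probabilities under two
# hypothetical examinee states (e.g., masters vs. non-masters of the measured attributes).
p_master = [0.85, 0.15]
p_nonmaster = [0.30, 0.70]
print(js_divergence(p_master, p_nonmaster))  # larger value = item better separates the two states
```

In practice the JS divergence generalizes to more than two component distributions with mixing weights (e.g., posterior probabilities of candidate attribute patterns), which is presumably how it enters the dual-objective selection index; the two-distribution case above is only the simplest instance.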
Saved in:
Published in: | Journal of educational measurement, 2017-06, Vol. 54 (2), p. 165-183 |
---|---|
Main authors: | Kang, Hyeon-Ah; Zhang, Susu; Chang, Hua-Hua |
Format: | Article |
Language: | English |
Subjects: | Adaptive Testing; Attributes; Cognitive Development; Cognitive Tests; Computer Assisted Testing; Computerization; Entropy; Item Banks; Probability; Recovery; Selection Criteria; Simulation; Statistical Distributions; Test Items |
Online access: | Full text |
DOI: | 10.1111/jedm.12139 |
---|---|
ISSN: | 0022-0655 |
EISSN: | 1745-3984 |
Publisher: | National Council on Measurement in Education, Madison |
Source: | Wiley Online Library Journals Frontfile Complete; Applied Social Sciences Index & Abstracts (ASSIA); Jstor Complete Legacy; Education Source |