Partial Identification of Answer Reviewing Effects in Multiple‐Choice Exams
Does reviewing previous answers during multiple‐choice exams help examinees increase their final score? This article formalizes the question using a rigorous causal framework, the potential outcomes framework. Viewing examinees’ reviewing status as a treatment and their final score as an outcome, the article first explains the challenges of identifying the causal effect of answer reviewing in regular exam‐taking settings. In addition to the incapability of randomizing the treatment selection (reviewing status) and the lack of other information to make this selection process ignorable, the treatment variable itself is not fully known to researchers. Looking at examinees’ answer sheet data, it is unclear whether an examinee who did not change his or her answer on a specific item reviewed it but retained the initial answer (treatment condition) or chose not to review it (control condition). Despite such challenges, however, the article develops partial identification strategies and shows that the sign of the answer reviewing effect can be reasonably inferred. By analyzing a statewide math assessment data set, the article finds that reviewing initial answers is generally beneficial for examinees.
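To make the missing-treatment problem concrete, the sketch below (Python, simulated data; it is not the article's estimator, and every name and number in it is an illustrative assumption) computes worst-case bounds on the gap in final-answer correctness between reviewed and non-reviewed items when an unchanged answer could mean either "reviewed but retained" or "never reviewed".

```python
import numpy as np
from itertools import product

# Illustrative simulation only: a changed answer proves the item was reviewed,
# while an unchanged answer is ambiguous (reviewed-but-retained vs. not reviewed).
rng = np.random.default_rng(0)
n = 200
changed = rng.random(n) < 0.25               # observed: answer was changed -> definitely reviewed
correct = (rng.random(n) < 0.6).astype(int)  # observed: final answer correct (0/1)

def difference_bounds(changed, correct):
    """Worst-case bounds on mean(correct | reviewed) - mean(correct | not reviewed)
    over every assignment of the ambiguous (unchanged) items to the two groups."""
    y_changed = correct[changed]          # items known to be reviewed
    amb = correct[~changed]               # items with unknown reviewing status
    m1 = int(amb.sum())                   # ambiguous items answered correctly
    m0 = int((1 - amb).sum())             # ambiguous items answered incorrectly
    diffs = []
    # Assign k1 ambiguous-correct and k0 ambiguous-incorrect items to "reviewed".
    # Items with the same outcome are exchangeable, so looping over counts is exhaustive.
    for k1, k0 in product(range(m1 + 1), range(m0 + 1)):
        n_t = len(y_changed) + k1 + k0
        n_c = (m1 - k1) + (m0 - k0)
        if n_t == 0 or n_c == 0:
            continue                      # both groups must be nonempty
        mean_t = (y_changed.sum() + k1) / n_t
        mean_c = (m1 - k1) / n_c
        diffs.append(mean_t - mean_c)
    return min(diffs), max(diffs)

lo, hi = difference_bounds(changed, correct)
print(f"worst-case bounds on the reviewed-vs-not difference: [{lo:.2f}, {hi:.2f}]")
# Without extra assumptions these bounds typically straddle zero; the article's
# partial identification strategies add the structure needed to infer the sign.
```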
Saved in:
Published in: | Journal of educational measurement, 2020-12, Vol. 57 (4), p. 511-526 |
---|---|
Main author: | Kim, Yongnam |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Full text |
| Field | Value |
---|---|
container_end_page | 526 |
container_issue | 4 |
container_start_page | 511 |
container_title | Journal of educational measurement |
container_volume | 57 |
creator | Kim, Yongnam |
description | Does reviewing previous answers during multiple‐choice exams help examinees increase their final score? This article formalizes the question using a rigorous causal framework, the potential outcomes framework. Viewing examinees’ reviewing status as a treatment and their final score as an outcome, the article first explains the challenges of identifying the causal effect of answer reviewing in regular exam‐taking settings. In addition to the incapability of randomizing the treatment selection (reviewing status) and the lack of other information to make this selection process ignorable, the treatment variable itself is not fully known to researchers. Looking at examinees’ answer sheet data, it is unclear whether an examinee who did not change his or her answer on a specific item reviewed it but retained the initial answer (treatment condition) or chose not to review it (control condition). Despite such challenges, however, the article develops partial identification strategies and shows that the sign of the answer reviewing effect can be reasonably inferred. By analyzing a statewide math assessment data set, the article finds that reviewing initial answers is generally beneficial for examinees. |
doi_str_mv | 10.1111/jedm.12259 |
format | Article |
fulltext | fulltext |
identifier | ISSN: 0022-0655 |
ispartof | Journal of educational measurement, 2020-12, Vol.57 (4), p.511-526 |
issn | ISSN 0022-0655; EISSN 1745-3984 |
language | eng |
recordid | cdi_webofscience_primary_000492462600001CitationCount |
source | Access via Wiley Online Library; Applied Social Sciences Index & Abstracts (ASSIA); Education Source; Web of Science - Social Sciences Citation Index – 2020 |
subjects | Answer Sheets; Answers; Data Analysis; Educational evaluation; Educational Practices; Educational tests & measurements; Identification; Identification (Psychology); Mathematics Tests; Multiple Choice Tests; Psychology; Psychology, Applied; Psychology, Educational; Psychology, Mathematical; Review (Reexamination); Scores; Social Sciences; Standardized tests; Tests |
title | Partial Identification of Answer Reviewing Effects in Multiple‐Choice Exams |