How Much Do Students' Scores in PISA Reflect General Intelligence and How Much Do They Reflect Specific Abilities?

Bibliographic Details
Published in: Journal of Educational Psychology, 2022-07, Vol. 114 (5), p. 1121-1135
Authors: Pokropek, Artur; Marks, Gary N.; Borgonovi, Francesca
Format: Article
Language: English
Online access: Full text
Description:

International Large-Scale Assessments (LSAs) allow comparisons of education systems' effectiveness in promoting student learning in specific domains, such as reading, mathematics, and science. However, it has been argued that students' scores in international LSAs mostly reflect general cognitive ability (g). This study examines the extent to which students' scores in reading, mathematics, science, and a Raven's Progressive Matrices test reflect general ability (g) and domain-specific abilities, using data from 3,472 Polish students who participated in the OECD's 2009 Programme for International Student Assessment (PISA) and who were retested with the same PISA instruments, but with a different item set, in 2010. Variance in students' responses to test items is explained better by a bifactor Item Response Theory (IRT) model than by the multidimensional IRT model routinely used to scale PISA and other LSAs. The bifactor IRT model assumes that non-g factors (reading, math, science, and Raven's test) are uncorrelated with g and with each other. The bifactor model generates specific ability factors with more theoretically credible relationships with criterion variables than the standard multidimensional model. Further analyses of the bifactor model indicate that the domain-specific factors are not reliable enough to be interpreted meaningfully; they lie somewhere between unreliable measures of domain-specific abilities and nuisance factors reflecting measurement error. The finding that PISA achievement scores mostly reflect g may arise because PISA aims to test broad abilities in a variety of contexts, or it may be a general characteristic of LSAs and national achievement tests.

Educational Impact and Implications Statement

This study analyzes Programme for International Student Assessment data from Poland to establish how much the achievement of secondary school students in reading, mathematics, science, and a Raven's Progressive Matrices test reflects general ability and how much it reflects domain-specific abilities. Findings indicate that a scaling model that accounts for general ability fits the data better than the models typically employed in large-scale assessments, which ignore the influence of general ability on student achievement. The finding that students' responses to PISA test items reflect general ability rather than domain-specific abilities, if replicated in other countries, could have important implications for the design of large-scale assessments and the interpretation of analyses of large-scale assessment data.
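As an illustration (not from the article), the following minimal Python sketch simulates item responses under the bifactor structure described above: every item loads on a general factor g and on exactly one specific factor (reading, math, science, Raven's), with all factors mutually uncorrelated. The item counts, discriminations (a_g, a_s), and the 2PL response function are assumptions chosen for illustration; making the general discrimination much larger than the specific one mirrors the paper's finding that domain scores are dominated by g.

import numpy as np

rng = np.random.default_rng(0)

n_students = 3472                # sample size reported in the abstract
domains = ["reading", "math", "science", "raven"]
items_per_domain = 10            # illustrative; not the real PISA item counts

# Bifactor assumption: one general factor plus one specific factor per
# domain, all mutually uncorrelated (drawn independently).
g = rng.standard_normal(n_students)
specific = {d: rng.standard_normal(n_students) for d in domains}

def simulate_domain(domain, a_g=1.2, a_s=0.5):
    """Simulate dichotomous item responses for one domain under a
    bifactor 2PL model: logit P(correct) = a_g*g + a_s*s_domain - b_item.
    a_g (general discrimination) is set well above a_s (specific
    discrimination), matching the finding that responses mostly reflect g."""
    b = rng.normal(0.0, 1.0, items_per_domain)   # item difficulties
    logits = a_g * g[:, None] + a_s * specific[domain][:, None] - b[None, :]
    p = 1.0 / (1.0 + np.exp(-logits))
    return (rng.random(p.shape) < p).astype(int)

responses = {d: simulate_domain(d) for d in domains}

# With a_g >> a_s, domain sum scores correlate strongly across domains --
# the shared variance a bifactor decomposition assigns to g.
scores = np.column_stack([responses[d].sum(axis=1) for d in domains])
print(np.corrcoef(scores, rowvar=False).round(2))

In the study itself the loadings are estimated from the data rather than fixed; the sketch only shows why weak specific loadings leave the domain-specific factors too unreliable to interpret on their own.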
DOI: 10.1037/edu0000687
Publisher: American Psychological Association (Washington)
ISSN: 0022-0663
EISSN: 1939-2176
Source: EBSCOhost APA PsycARTICLES
Subjects:
Academic grading
Achievement Tests
Cognitive Ability
Educational Measures
Female
Foreign Countries
Human
Intelligence
International Assessment
International Students
Item Response Theory
Male
Mathematical Ability
Mathematics
Mathematics Achievement
Mathematics education
Reading
Reading Ability
Reading Achievement
Science Achievement
Science Education
Scores
Secondary School Students
URL: https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-26T03%3A22%3A51IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=How%20Much%20Do%20Students'%20Scores%20in%20PISA%20Reflect%20General%20Intelligence%20and%20How%20Much%20Do%20They%20Reflect%20Specific%20Abilities?&rft.jtitle=Journal%20of%20educational%20psychology&rft.au=Pokropek,%20Artur&rft.date=2022-07-01&rft.volume=114&rft.issue=5&rft.spage=1121&rft.epage=1135&rft.pages=1121-1135&rft.issn=0022-0663&rft.eissn=1939-2176&rft_id=info:doi/10.1037/edu0000687&rft_dat=%3Cproquest_cross%3E2579764966%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2579764966&rft_id=info:pmid/&rft_ericid=EJ1372651&rfr_iscdi=true