Automated scoring in context: Rapid assessment for placed students

► Applies a new evaluation framework for automated essay scoring (AES).
► Proposes a new use for AES: rapid assessment of placed students in first-year classes.
► Offers results on relationships between AES and established writing measures.

This study investigated the use of automated essay scoring (AES) to identify at-risk students enrolled in a first-year university writing course. An application of AES, the Criterion® Online Writing Evaluation Service, was evaluated through a methodology focusing on construct modelling, response processes, disaggregation, extrapolation, generalization, and consequence. Based on the results of our two-year study with students (N = 1,482) at a public technological research university in the United States, we found that Criterion offered a defined writing construct congruent with established models, achieved acceptance among students and instructors, showed no statistically significant differences between ethnicity groups of sufficient sample size, correlated at acceptable levels with other writing measures, performed in a stable fashion, and enabled instructors to identify at-risk students so as to increase their course success.
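
The validation steps summarized in the abstract (correlating AES scores with other writing measures, and checking for score differences across ethnicity groups) can be illustrated with a short sketch. This is a hypothetical illustration only, not code or data from the study; the simulated scores, group labels, and the choice of Pearson correlation and one-way ANOVA are assumptions made for the example.

```python
# Hypothetical sketch of the extrapolation and disaggregation checks the
# abstract describes: correlate AES scores with another writing measure and
# test for mean-score differences across demographic groups.
# All data and variable names below are invented for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated AES scores and an external writing measure for 300 students.
aes_scores = rng.normal(4.0, 1.0, size=300)
other_measure = 0.6 * aes_scores + rng.normal(0.0, 0.8, size=300)

# Extrapolation check: how closely do AES scores track the external measure?
r, p_corr = stats.pearsonr(aes_scores, other_measure)
print(f"Pearson r = {r:.2f} (p = {p_corr:.3f})")

# Disaggregation check: do mean AES scores differ across groups?
groups = rng.choice(["A", "B", "C"], size=300)
f_stat, p_anova = stats.f_oneway(*(aes_scores[groups == g] for g in "ABC"))
print(f"ANOVA F = {f_stat:.2f} (p = {p_anova:.3f})")
```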

Bibliographic Details
Published in: Assessing Writing, 2013-01, Vol. 18 (1), pp. 62-84
Authors: Klobucar, Andrew; Elliot, Norbert; Deess, Perry; Rudniy, Oleksandr; Joshi, Kamal
Format: Article
Language: English
Online access: Full text
DOI: 10.1016/j.asw.2012.10.001
ISSN: 1075-2935
EISSN: 1873-5916
Source: Elsevier ScienceDirect Journals
Subjects:
At Risk Students
Automated essay scoring (AES)
Automation
College Freshmen
Computer Assisted Testing
Computer Software Evaluation
Essay Tests
Essays
Ethnic Groups
Research Universities
Scoring
Statistical Analysis
Student Placement
Validation methods
Writing assessment
Writing Evaluation
Writing Instruction
Writing placement