Leveraging natural language processing to support automated assessment and feedback for student open responses in mathematics
Published in: | Journal of computer assisted learning 2023-06, Vol.39 (3), p.823-840 |
---|---|
Main authors: | Botelho, Anthony; Baral, Sami; Erickson, John A.; Benachamardi, Priyanka; Heffernan, Neil T. |
Format: | Article |
Language: | English |
Online access: | Full text |
container_end_page | 840 |
---|---|
container_issue | 3 |
container_start_page | 823 |
container_title | Journal of computer assisted learning |
container_volume | 39 |
creator | Botelho, Anthony Baral, Sami Erickson, John A. Benachamardi, Priyanka Heffernan, Neil T. |
description | Background
Teachers often rely on the use of open‐ended questions to assess students' conceptual understanding of assigned content. Particularly in the context of mathematics, teachers use these types of questions to gain insight into the processes and strategies adopted by students in solving mathematical problems, beyond what is possible through more closed‐ended problem types. While these types of problems are valuable to teachers, the variation in student responses makes them difficult and time‐consuming to evaluate and to provide directed feedback on. It is well established that feedback, both as a numeric score and, more importantly, in the form of teacher‐authored comments, can guide students on how to improve, leading to increased learning. It is for this reason that teachers need better support not only for assessing students' work but also for providing meaningful and directed feedback to students.
Objectives
In this paper, we seek to develop, evaluate, and examine machine learning models that support automated open response assessment and feedback.
Methods
We build upon prior research on the automatic assessment of student responses to open‐ended problems and introduce a novel approach that leverages student log data combined with machine learning and natural language processing methods. Using sentence‐level semantic representations of student responses to open‐ended questions, we propose a collaborative filtering‐based approach to both predict student scores and recommend appropriate feedback messages for teachers to send to their students.
Results and Conclusion
We find that our method outperforms previously published benchmarks across three different metrics for the task of predicting student performance. Through an error analysis, we identify several areas where future work may improve upon our approach.
Lay Description
What is already known about this topic
Open‐ended questions are used by teachers in the domain of mathematics to assess their students' understanding, but automated support for these types of questions is limited in online learning platforms. Recent advancements in machine learning and natural language processing have led to promising results across a range of domains and applications.
What this paper adds
Emulating how teachers identify similar student answers can be used to build tools that support teachers in assessing and providing feedback to student open‐ended work.
Implications for practice
Developing better automated supports can increase the amount of direct feedback students receive to guide their learning. |
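The similarity-based idea in the Methods section can be sketched in miniature. This is a hypothetical illustration, not the authors' implementation: the toy vectors below stand in for real sentence embeddings (which a sentence encoder would produce), and the names `cosine_sim` and `predict_score` are illustrative. A new response's score is predicted as the similarity-weighted average of the k most similar previously scored responses, a memory-based form of collaborative filtering.

```python
import math

def cosine_sim(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def predict_score(new_emb, scored, k=3):
    """Predict a score for a new response embedding.

    scored: list of (embedding, teacher_score) pairs for responses
    a teacher has already graded. The prediction is the
    similarity-weighted average of the k nearest neighbours.
    """
    sims = sorted(((cosine_sim(new_emb, e), s) for e, s in scored),
                  reverse=True)
    top = [(max(sim, 0.0), s) for sim, s in sims[:k]]
    total = sum(w for w, _ in top)
    if total == 0:
        # No positively similar neighbours: fall back to the mean score.
        return sum(s for _, s in scored) / len(scored)
    return sum(w * s for w, s in top) / total

# Toy example: two near-identical correct responses (score 4) and one
# unrelated response (score 1); the new response resembles the first two.
scored = [([1.0, 0.0], 4), ([0.9, 0.1], 4), ([0.0, 1.0], 1)]
print(predict_score([0.95, 0.05], scored, k=2))  # -> 4.0
```

The same neighbour search can drive feedback recommendation: rather than averaging scores, the system can surface the teacher comments attached to the most similar graded responses.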
doi_str_mv | 10.1111/jcal.12793 |
format | Article |
publisher | Chichester, UK: John Wiley & Sons, Inc |
rights | 2023 John Wiley & Sons Ltd. |
fulltext | fulltext |
identifier | ISSN: 0266-4909 |
ispartof | Journal of computer assisted learning, 2023-06, Vol.39 (3), p.823-840 |
issn | 0266-4909 1365-2729 |
language | eng |
recordid | cdi_proquest_journals_2817946358 |
source | Access via Wiley Online Library |
subjects | Academic Achievement; Artificial Intelligence; automated assessment; Automation; Computer Assisted Testing; Distance learning; Domains; Electronic Learning; Error analysis; Error Analysis (Language); Feedback; Feedback (Response); feedback recommendation; Language Processing; Machine learning; Mathematical analysis; Mathematics; Mathematics Achievement; Mathematics Education; Mathematics Tests; Natural Language Processing; open responses; Performance prediction; Prediction; Questions; Semantics; sentence embeddings; similarity; Student Evaluation; Student Journals; Students; Teachers; Teaching Methods; Test Format |
title | Leveraging natural language processing to support automated assessment and feedback for student open responses in mathematics |