Leveraging natural language processing to support automated assessment and feedback for student open responses in mathematics

Bibliographic details
Published in: Journal of computer assisted learning, 2023-06, Vol. 39 (3), p. 823-840
Main authors: Botelho, Anthony, Baral, Sami, Erickson, John A., Benachamardi, Priyanka, Heffernan, Neil T.
Format: Article
Language: English
Keywords:
Online access: Full text
Description
Summary:
Background: Teachers often rely on open‐ended questions to assess students' conceptual understanding of assigned content. Particularly in the context of mathematics, teachers use these types of questions to gain insight into the processes and strategies adopted by students in solving mathematical problems, beyond what is possible through more closed‐ended problem types. While these types of problems are valuable to teachers, the variation in student responses makes them difficult and time‐consuming to evaluate and to provide directed feedback on. It is well established that feedback, both as a numeric score and, more importantly, in the form of teacher‐authored comments, can guide students on how to improve, leading to increased learning. For this reason, teachers need better support not only for assessing students' work but also for providing meaningful and directed feedback to students.
Objectives: In this paper, we seek to develop, evaluate, and examine machine learning models that support automated open‐response assessment and feedback.
Methods: We build upon prior research in the automatic assessment of student responses to open‐ended problems and introduce a novel approach that leverages student log data combined with machine learning and natural language processing methods. Utilizing sentence‐level semantic representations of student responses to open‐ended questions, we propose a collaborative filtering‐based approach to both predict student scores and recommend appropriate feedback messages for teachers to send to their students.
Results and Conclusion: We find that our method outperforms previously published benchmarks across three different metrics for the task of predicting student performance. Through an error analysis, we identify several areas where future work may be able to improve upon our approach.
Lay Description
What is already known about this topic: Open‐ended questions are used by teachers in the domain of mathematics to assess their students' understanding, but automated support for these types of questions is limited in online learning platforms. Recent advancements in machine learning and natural language processing have led to promising results in a range of domains and applications.
What this paper adds: Emulating how teachers identify similar student answers can be used to build tools that support teachers in assessing and providing feedback to student open‐ended responses.
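The collaborative filtering‐based idea sketched in the Methods paragraph can be illustrated with a minimal nearest‐neighbour simplification: embed each student response with a sentence‐level encoder, find the most similar already‐graded responses to the same problem, and reuse their scores and teacher comments. The sketch below is not the authors' published model; the sentence-transformers package, the "all-MiniLM-L6-v2" checkpoint, the function name predict_score_and_feedback, and the similarity‐weighted k‐NN averaging are assumptions chosen purely for illustration.

# Minimal sketch (assumptions noted above), not the paper's exact model:
# score a new open response by comparing its sentence embedding against
# already-graded responses to the same problem, then reuse the neighbours'
# scores and the most similar response's teacher feedback.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # stand-in sentence encoder

def predict_score_and_feedback(new_response, graded_responses, scores, feedback, k=5):
    """graded_responses: list[str]; scores: list[float]; feedback: list[str]."""
    # Embed the new response and the previously graded responses.
    emb_new = model.encode([new_response], normalize_embeddings=True)[0]
    emb_old = model.encode(graded_responses, normalize_embeddings=True)
    # Cosine similarity (embeddings are L2-normalized, so a dot product suffices).
    sims = emb_old @ emb_new
    top = np.argsort(sims)[::-1][:k]
    # Similarity-weighted average of the neighbours' scores (a simple k-NN
    # analogue of the collaborative-filtering idea described in the abstract).
    weights = np.clip(sims[top], 0.0, None) + 1e-9
    pred_score = float(np.average([scores[i] for i in top], weights=weights))
    # Suggest the feedback attached to the single most similar graded response.
    suggested_feedback = feedback[top[0]]
    return pred_score, suggested_feedback

For example, given a bank of graded responses to one problem, predict_score_and_feedback("I multiplied both sides by 3", graded, scores, comments) would return an estimated score and a candidate comment for the teacher to review rather than author from scratch.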
ISSN: 0266-4909, 1365-2729
DOI: 10.1111/jcal.12793