Knowing "When" and "Where": Temporal-ASTNN for Student Learning Progression in Novice Programming Tasks

Bibliographic Details
Published in: International Educational Data Mining Society, 2021
Main Authors: Mao, Ye; Shi, Yang; Marwan, Samiha; Price, Thomas W.; Barnes, Tiffany; Chi, Min
Format: Report
Language: English
Subjects:
Online Access: Order full text
Description
Abstract: As students learn to program, both their programming code and their understanding of it evolve over time. In this work, we present a general data-driven approach, named "Temporal-ASTNN," for modeling student learning progression in open-ended programming domains. Temporal-ASTNN combines a novel neural network model based on abstract syntax trees (ASTs), named ASTNN, with a Long Short-Term Memory (LSTM) model. ASTNN handles the "linguistic" nature of student programming code, while the LSTM handles the "temporal" nature of student learning progression. The effectiveness of ASTNN is first compared against other models, including a state-of-the-art algorithm, Code2Vec, across two programming domains, iSnap and Java, on the task of program classification ("correct" or "incorrect"). The proposed Temporal-ASTNN is then compared against the original ASTNN and other temporal models on the challenging task of early prediction of student success. Our results show that Temporal-ASTNN achieves the best performance using only the first four minutes of temporal data, and it continues to outperform all other models on longer trajectories. [For the full proceedings, see ED615472.]
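
Illustrative sketch (an assumption, not the authors' implementation): the abstract above describes combining a per-snapshot code encoder (ASTNN in the paper) with an LSTM over a student's submission trajectory. The PyTorch sketch below substitutes a generic feed-forward encoder for ASTNN and predicts "correct"/"incorrect" from the LSTM's final hidden state; all class names, dimensions, and parameters are hypothetical.

import torch
import torch.nn as nn

class SnapshotEncoder(nn.Module):
    # Placeholder for the AST-based ASTNN encoder: maps one already-vectorized
    # code snapshot (e.g., pooled AST statement embeddings) to a fixed vector.
    def __init__(self, in_dim: int, hidden_dim: int):
        super().__init__()
        self.proj = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.ReLU())

    def forward(self, snapshot_features: torch.Tensor) -> torch.Tensor:
        return self.proj(snapshot_features)

class TemporalCodeModel(nn.Module):
    # Encode each snapshot in a trajectory, run an LSTM over the sequence,
    # and classify success from the final hidden state.
    def __init__(self, in_dim: int = 128, hidden_dim: int = 64):
        super().__init__()
        self.encoder = SnapshotEncoder(in_dim, hidden_dim)
        self.lstm = nn.LSTM(hidden_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, 2)

    def forward(self, trajectories: torch.Tensor) -> torch.Tensor:
        # trajectories: (batch, time_steps, in_dim)
        batch, steps, dim = trajectories.shape
        encoded = self.encoder(trajectories.reshape(batch * steps, dim))
        encoded = encoded.reshape(batch, steps, -1)
        _, (h_n, _) = self.lstm(encoded)
        return self.classifier(h_n[-1])  # logits over {"incorrect", "correct"}

# Example: 8 students, trajectories of 5 snapshots, 128-dim snapshot features.
model = TemporalCodeModel()
logits = model(torch.randn(8, 5, 128))

Reading the prediction from the LSTM's final hidden state is one natural choice for early prediction, since the summary can be recomputed after however many snapshots have been observed so far.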