Weakly supervised setting for learning concept prerequisite relations using multi-head attention variational graph auto-encoders
Published in: Knowledge-Based Systems 2022-07, Vol. 247, p. 108689, Article 108689
Main authors: , , , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: An increasing number of learners benefit from the educational resources in Massive Open Online Courses (MOOCs) through self-regulated learning. However, it is difficult for learners to organize a suitable learning path from these resources if they do not know the prerequisite relations between concepts. Manually labeling prerequisite relations between concepts is time-consuming and requires significant domain knowledge; moreover, labeling large concept datasets is especially difficult. How should learners begin when they face a massive number of knowledge concepts in MOOCs? To address these problems, we propose an end-to-end graph-network-based model called Multi-Head Attention Variational Graph Auto-Encoders (MHAVGAE), which automatically labels prerequisite relations between concepts via a resource-concept graph. First, we build a resource-concept graph from learning resources, concepts, and their relations, and introduce a multi-head attention mechanism to compute the hidden representation of each vertex over this graph; the aim is to reduce the cognitive differences inherent in manual labeling and to account for the mutual influence between vertices. Second, we design a gated fusion mechanism that fuses resource and concept features to enrich the concept representations. Third, we propose a metric named Resource Prerequisite Reference Distance (RPRD), which generates inaccurate (noisy) concept prerequisite relations to reduce manual labeling, and we extend MHAVGAE to a weakly supervised setting for learning concept prerequisite relations. Finally, we conduct extensive experiments demonstrating the effectiveness of MHAVGAE on multiple widely used metrics in comparison with state-of-the-art methods. The experimental results show that MHAVGAE outperforms almost all baseline methods and that its weakly supervised setting is beneficial.
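The pipeline the abstract describes (multi-head attention over a resource-concept graph, gated fusion of resource and concept features, then link prediction between concepts) can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the toy graph, dimensions (`n_heads`, `d_out`), and helper names are illustrative assumptions for the general attention-auto-encoder pattern.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def multi_head_attention_layer(X, A, n_heads=4, d_out=8):
    """One multi-head attention layer over a graph (illustrative).

    X: (N, d_in) node features; A: (N, N) adjacency with self-loops.
    Each head attends only over a node's neighbours; head outputs
    are averaged into the (N, d_out) hidden representation.
    """
    N, d_in = X.shape
    outs = []
    for _ in range(n_heads):
        W = rng.normal(scale=0.1, size=(d_in, d_out))  # per-head projection
        H = X @ W                                      # (N, d_out)
        scores = H @ H.T / np.sqrt(d_out)              # pairwise attention logits
        scores = np.where(A > 0, scores, -1e9)         # mask non-neighbours
        alpha = softmax(scores, axis=1)                # attention weights
        outs.append(alpha @ H)                         # neighbourhood mixing
    return np.mean(outs, axis=0)

def gated_fusion(Hr, Hc):
    """Generic gated fusion: g = sigmoid([Hr; Hc] Wg); g*Hr + (1-g)*Hc."""
    Wg = rng.normal(scale=0.1, size=(Hr.shape[1] * 2, Hr.shape[1]))
    g = sigmoid(np.concatenate([Hr, Hc], axis=1) @ Wg)
    return g * Hr + (1 - g) * Hc

# Toy resource-concept graph: 5 nodes, symmetric adjacency + self-loops.
N = 5
A = (rng.random((N, N)) < 0.4).astype(float)
A = np.clip(A + A.T + np.eye(N), 0, 1)
X = rng.normal(size=(N, 16))

Z_r = multi_head_attention_layer(X, A)   # resource-side embeddings (toy)
Z_c = multi_head_attention_layer(X, A)   # concept-side embeddings (toy)
Z = gated_fusion(Z_r, Z_c)               # fused node representations
P = sigmoid(Z @ Z.T)                     # inner-product decoder: edge scores
print(P.shape)                           # pairwise prerequisite-score matrix
```

In a variational variant, the encoder would output a mean and log-variance per node and sample `Z` before decoding; the masked-attention and inner-product decoding steps stay the same.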
ISSN: 0950-7051, 1872-7409
DOI: 10.1016/j.knosys.2022.108689