Consistency and Monotonicity Regularization for Neural Knowledge Tracing
Knowledge Tracing (KT), tracking a human's knowledge acquisition, is a central component in online learning and AI in Education. In this paper, we present a simple, yet effective strategy to improve the generalization ability of KT models: we propose three types of novel data augmentation, coined replacement, insertion, and deletion, along with corresponding regularization losses that impose certain consistency or monotonicity biases on the model's predictions for the original and augmented sequence. Extensive experiments on various KT benchmarks show that our regularization scheme consistently improves the model performances, under 3 widely-used neural networks and 4 public benchmarks, e.g., it yields 6.3% improvement in AUC under the DKT model and the ASSISTmentsChall dataset.
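To make the abstract's idea concrete, the sketch below shows one way the replacement and insertion augmentations and their accompanying losses could be wired up in PyTorch for a generic KT model. This is a minimal illustration, not the authors' released code: the `model(questions, answers)` interface returning per-step success probabilities, the skill-based replacement rule, and the direction of the monotonicity constraint are all assumptions made for this example.

```python
# Hypothetical sketch, NOT the authors' implementation: the model interface,
# the skill-based replacement rule, and the direction of the monotonicity
# constraint are assumptions made purely for illustration.
import random
import torch
import torch.nn.functional as F


def replace_augment(questions, answers, skill_of, questions_by_skill, p=0.1):
    """Replacement: swap some question IDs for other questions tagged with the
    same skill, leaving the answer sequence untouched (assumed rule)."""
    new_q = questions.clone()
    for t in range(questions.size(0)):
        if random.random() < p:
            skill = skill_of[int(questions[t])]
            new_q[t] = random.choice(questions_by_skill[skill])
    return new_q, answers.clone()


def consistency_loss(model, questions, answers, aug_questions, aug_answers):
    """Replacement should barely move the predictions, so penalize the gap
    between the original and augmented prediction sequences."""
    p_orig = model(questions, answers)          # assumed: per-step success probs
    p_aug = model(aug_questions, aug_answers)   # same length as p_orig
    return F.mse_loss(p_aug, p_orig.detach())


def insertion_monotonicity_loss(model, questions, answers, pos, easy_q):
    """Insertion: splice in one correctly answered interaction. Assumed bias:
    the predicted success probability on the last question should not drop
    after the insertion, so only violations of that ordering are penalized."""
    p_orig = model(questions, answers)[-1]
    q_aug = torch.cat([questions[:pos], easy_q.view(1), questions[pos:]])
    a_aug = torch.cat([answers[:pos],
                       torch.ones(1, dtype=answers.dtype),
                       answers[pos:]])
    p_aug = model(q_aug, a_aug)[-1]
    return F.relu(p_orig.detach() - p_aug)      # hinge: penalize p_aug < p_orig
```

A deletion variant would presumably mirror the insertion term with the inequality reversed, and all regularization terms would be added to the usual binary cross-entropy objective with tunable weights.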
creator | Lee, Seewoo; Choi, Youngduck; Park, Juneyoung; Kim, Byungsoo; Shin, Jinwoo |
doi_str_mv | 10.48550/arxiv.2105.00607 |
format | Article |
creationdate | 2021-05-02 |
rights | http://creativecommons.org/licenses/by/4.0 |
oa | free_for_read |
fulltext | fulltext_linktorsrc |
identifier | DOI: 10.48550/arxiv.2105.00607 |
language | eng |
recordid | cdi_arxiv_primary_2105_00607 |
source | arXiv.org |
subjects | Computer Science - Artificial Intelligence; Computer Science - Learning |
title | Consistency and Monotonicity Regularization for Neural Knowledge Tracing |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-08T11%3A32%3A04IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-arxiv_GOX&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Consistency%20and%20Monotonicity%20Regularization%20for%20Neural%20Knowledge%20Tracing&rft.au=Lee,%20Seewoo&rft.date=2021-05-02&rft_id=info:doi/10.48550/arxiv.2105.00607&rft_dat=%3Carxiv_GOX%3E2105_00607%3C/arxiv_GOX%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true |