University of Wollongong


CAKT: Coupling contrastive learning with attention networks for interpretable knowledge tracing

journal contribution
posted on 2024-11-17, 13:48 authored by Shuaishuai Zu, Li Li, Jun Shen
In intelligent systems, knowledge tracing (KT) plays a vital role in providing personalized education. Existing KT methods often rely on students' learning interactions to trace their knowledge states by predicting future performance on given questions. While deep learning-based KT models achieve better predictive performance than traditional KT models, they often lack interpretability of the captured knowledge states. Furthermore, previous works generally neglect the multiple kinds of semantic information contained in knowledge states and in sparse learning interactions. In this paper, we propose a novel model named CAKT, which couples contrastive learning with attention networks for interpretable knowledge tracing. Specifically, we use three attention-based encoders, built on designed learning sequences, to model three dynamic factors of the Item Response Theory (IRT) model. Then, we identify two key properties of knowledge states and learning interactions: consistency and separability. We use contrastive learning to incorporate the semantic information conveyed by these properties into the representations of knowledge states and learning interactions; training with this contrastive objective yields more expressive representations of both. Extensive experiments demonstrate the strong predictive performance of CAKT and the positive effect of modeling the two properties. Additionally, CAKT exhibits high interpretability of the captured knowledge states.
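The abstract describes coupling an IRT-style prediction with a contrastive objective that enforces consistency and separability. The sketch below illustrates that general idea only, not the paper's implementation: the choice of the three dynamic factors as ability, difficulty, and discrimination, the 2PL IRT response function, the InfoNCE form of the contrastive loss, the 0.1 loss weight, and all variable names are assumptions made for illustration.

```python
# Minimal sketch (not the authors' code): an IRT-style prediction head combined
# with a contrastive loss. Assumes the three dynamic factors are student ability
# (theta), question difficulty (beta), and question discrimination (alpha),
# each produced by a separate encoder.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def irt_predict(theta, beta, alpha):
    """2PL IRT item response function: P(correct) = sigma(alpha * (theta - beta))."""
    return sigmoid(alpha * (theta - beta))

def info_nce(anchor, positive, negatives, temperature=0.1):
    """InfoNCE loss: pull anchor and positive together (consistency),
    push anchor away from negatives (separability)."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8)
    pos = np.exp(cos(anchor, positive) / temperature)
    neg = sum(np.exp(cos(anchor, n) / temperature) for n in negatives)
    return -np.log(pos / (pos + neg))

# Toy example: one student/question pair plus a contrastive term on
# hypothetical knowledge-state embeddings.
theta, beta, alpha = 0.8, 0.3, 1.2           # encoder outputs (scalars here for brevity)
p_correct = irt_predict(theta, beta, alpha)  # predictive component
y = 1.0                                      # observed response (correct)
bce = -(y * np.log(p_correct) + (1 - y) * np.log(1 - p_correct))

rng = np.random.default_rng(0)
h_anchor, h_pos = rng.normal(size=16), rng.normal(size=16)
h_negs = [rng.normal(size=16) for _ in range(4)]
total_loss = bce + 0.1 * info_nce(h_anchor, h_pos, h_negs)  # coupled training objective
print(f"prediction loss={bce:.3f}, total loss={total_loss:.3f}")
```

In this kind of coupling, the contrastive term acts as an auxiliary objective on the learned representations while the IRT-style term drives response prediction; how CAKT constructs positive and negative pairs is specified in the paper itself.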

Funding

National Natural Science Foundation of China (61877051)

History

Journal title

Proceedings of the International Joint Conference on Neural Networks

Volume

2023-June

Language

English
