MPSKT: Multi-head ProbSparse Self-Attention for Knowledge Tracing
6th International Conference on Computer Science and Application Engineering, CSAE 2022 ; 2022.
Article in English | Scopus | ID: covidwho-2194123
ABSTRACT
Over the past two years, COVID-19 has driven a widespread shift to online education, and knowledge tracing has been deployed on many educational platforms. However, most existing knowledge tracing models still suffer from the long-term dependency problem. To address this, we propose Multi-head ProbSparse Self-Attention for Knowledge Tracing (MPSKT). First, a temporal convolutional network encodes the positional information of the input sequence. Then, Multi-head ProbSparse Self-Attention in the encoder and decoder blocks captures the relationships among the input sequences, while convolution and pooling layers in the encoder block shorten the input sequence, which greatly reduces the model's time complexity and better mitigates long-term dependence. Finally, experimental results on three public online education datasets demonstrate the effectiveness of the proposed model. © 2022 Association for Computing Machinery.
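The core operation the abstract names, ProbSparse self-attention, can be illustrated with a minimal single-head sketch: only the u most "active" queries (those whose score distribution deviates most from uniform) attend over the keys, while the remaining queries fall back to the mean of the values. This is a simplified illustration, not the paper's implementation; the function name, the parameter u, and the choice to score every query against all keys (rather than a sampled key subset) are assumptions made here for clarity.

```python
import numpy as np

def probsparse_attention(Q, K, V, u):
    """Simplified single-head ProbSparse self-attention sketch.

    Hypothetical simplification: sparsity scores are computed against
    all keys; the original formulation samples a key subset first.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                 # (Lq, Lk) scaled dot-products
    # Sparsity measure M(q_i) = max_j s_ij - mean_j s_ij:
    # large M means the query's attention is far from uniform.
    M = scores.max(axis=1) - scores.mean(axis=1)
    top_u = np.argsort(M)[-u:]                    # indices of the u active queries
    # "Lazy" queries receive the mean of V instead of full attention.
    out = np.tile(V.mean(axis=0), (Q.shape[0], 1))
    s = scores[top_u]
    w = np.exp(s - s.max(axis=1, keepdims=True))  # numerically stable softmax
    w /= w.sum(axis=1, keepdims=True)
    out[top_u] = w @ V                            # full attention for active queries
    return out
```

With u equal to the full query length this reduces to ordinary scaled dot-product attention; with u much smaller than the sequence length, only u rows of the softmax are computed, which is the source of the complexity reduction the abstract describes.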
Full text: Available Collection: Databases of international organizations Database: Scopus Language: English Journal: 6th International Conference on Computer Science and Application Engineering, CSAE 2022 Year: 2022 Document Type: Article