R-VGAE: Relational-variational Graph Autoencoder for Unsupervised Prerequisite Chain Learning

Irene Li, Alexander Fabbri, Swapnil Hingmire, Dragomir Radev


Abstract
The task of concept prerequisite chain learning is to automatically determine the existence of prerequisite relationships among concept pairs. In this paper, we frame learning prerequisite relationships among concepts as an unsupervised task with no access to labeled concept pairs during training. We propose a model called the Relational-Variational Graph AutoEncoder (R-VGAE) to predict concept relations within a graph consisting of concept and resource nodes. Results show that our unsupervised approach outperforms graph-based semi-supervised methods and other baseline methods by up to 9.77% and 10.47% in terms of prerequisite relation prediction accuracy and F1 score, respectively. Our method is notably the first graph-based model that attempts to make use of deep learning representations for the task of unsupervised prerequisite learning. We also expand an existing corpus, which now totals 1,717 English Natural Language Processing (NLP)-related lecture slide files and includes manual concept pair annotations over 322 topics.
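To make the idea in the abstract concrete, the following is a minimal sketch of a relational variational graph autoencoder in plain PyTorch: an encoder with per-relation weight matrices produces a latent Gaussian per node, and an inner-product decoder scores candidate edges. This is not the authors' released implementation (see the LectureBank repository linked below); the class names, layer sizes, number of relation types, and toy graph here are all illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class RGCNLayer(nn.Module):
    """One relational graph-convolution layer: a separate weight
    matrix per relation type, with messages summed over relations."""
    def __init__(self, in_dim, out_dim, num_relations):
        super().__init__()
        self.weights = nn.ModuleList(
            nn.Linear(in_dim, out_dim, bias=False) for _ in range(num_relations)
        )
        self.self_loop = nn.Linear(in_dim, out_dim)

    def forward(self, x, adjs):
        # adjs: list of (num_nodes, num_nodes) normalized adjacency
        # matrices, one per relation type (illustrative representation).
        out = self.self_loop(x)
        for adj, w in zip(adjs, self.weights):
            out = out + adj @ w(x)
        return out

class RVGAE(nn.Module):
    """Sketch only: encoder = two RGCN layers producing mu and logvar;
    decoder = inner product between latent node embeddings."""
    def __init__(self, in_dim, hidden_dim, latent_dim, num_relations):
        super().__init__()
        self.shared = RGCNLayer(in_dim, hidden_dim, num_relations)
        self.mu_layer = RGCNLayer(hidden_dim, latent_dim, num_relations)
        self.logvar_layer = RGCNLayer(hidden_dim, latent_dim, num_relations)

    def encode(self, x, adjs):
        h = F.relu(self.shared(x, adjs))
        return self.mu_layer(h, adjs), self.logvar_layer(h, adjs)

    def reparameterize(self, mu, logvar):
        return mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)

    def decode(self, z):
        # Probability of an edge between every pair of nodes.
        return torch.sigmoid(z @ z.t())

    def forward(self, x, adjs):
        mu, logvar = self.encode(x, adjs)
        z = self.reparameterize(mu, logvar)
        return self.decode(z), mu, logvar

# Toy usage: 6 nodes (concepts plus resources) and 2 relation types,
# e.g. resource-concept links and concept-concept links (assumed setup).
num_nodes, num_relations = 6, 2
x = torch.randn(num_nodes, 16)                               # node features
adjs = [torch.eye(num_nodes) for _ in range(num_relations)]  # placeholder graphs

model = RVGAE(in_dim=16, hidden_dim=32, latent_dim=8, num_relations=num_relations)
adj_hat, mu, logvar = model(x, adjs)

# Standard VGAE objective: edge reconstruction loss plus a KL term.
target = torch.eye(num_nodes)                                # placeholder edges
recon = F.binary_cross_entropy(adj_hat, target)
kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
loss = recon + kl
loss.backward()

At inference time, the decoder's score for a concept pair can be read off directly from the reconstructed adjacency, which is what makes the approach usable without labeled prerequisite pairs during training.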
Anthology ID: 2020.coling-main.99
Volume: Proceedings of the 28th International Conference on Computational Linguistics
Month: December
Year: 2020
Address: Barcelona, Spain (Online)
Venue: COLING
Publisher: International Committee on Computational Linguistics
Pages: 1147–1157
URL: https://aclanthology.org/2020.coling-main.99
DOI: 10.18653/v1/2020.coling-main.99
Cite (ACL): Irene Li, Alexander Fabbri, Swapnil Hingmire, and Dragomir Radev. 2020. R-VGAE: Relational-variational Graph Autoencoder for Unsupervised Prerequisite Chain Learning. In Proceedings of the 28th International Conference on Computational Linguistics, pages 1147–1157, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal): R-VGAE: Relational-variational Graph Autoencoder for Unsupervised Prerequisite Chain Learning (Li et al., COLING 2020)
PDF: https://preview.aclanthology.org/auto-file-uploads/2020.coling-main.99.pdf
Code: Yale-LILY/LectureBank