Unsupervised Cross-Domain Prerequisite Chain Learning using Variational Graph Autoencoders

Irene Li, Vanessa Yan, Tianxiao Li, Rihao Qu, Dragomir Radev

Abstract
Learning prerequisite chains is an important task for picking up knowledge efficiently in both known and unknown domains. For example, one may be an expert in the natural language processing (NLP) domain but want to determine the best order in which to learn new concepts in an unfamiliar computer vision (CV) domain. Both domains share some common concepts, such as machine learning basics and deep learning models. In this paper, we solve the task of unsupervised cross-domain concept prerequisite chain learning using an optimized variational graph autoencoder. Our model learns to transfer concept prerequisite relations from an information-rich domain (source domain) to an information-poor domain (target domain), substantially surpassing other baseline models. In addition, we expand an existing dataset by introducing two new domains: CV and Bioinformatics (BIO). The annotated data and resources, as well as the code, will be made publicly available.
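The abstract describes transferring prerequisite relations with a variational graph autoencoder (VGAE). As a rough illustration of the general VGAE technique (not the authors' optimized model), the sketch below runs a minimal forward pass over a toy concept graph in NumPy: a GCN encoder produces node-wise means and log-variances, a reparameterized sample gives latent concept embeddings, and an inner-product decoder scores candidate prerequisite links. All weights, dimensions, and the toy graph are arbitrary placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def normalize_adj(A):
    # Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def vgae_forward(A, X, W0, W_mu, W_logvar):
    A_norm = normalize_adj(A)
    # Shared first GCN layer with ReLU
    H = np.maximum(A_norm @ X @ W0, 0.0)
    # Two GCN heads: per-node mean and log-variance of the latent code
    mu = A_norm @ H @ W_mu
    logvar = A_norm @ H @ W_logvar
    # Reparameterization trick: z = mu + sigma * eps
    Z = mu + np.exp(0.5 * logvar) * rng.standard_normal(mu.shape)
    # Inner-product decoder: sigmoid(Z Z^T) scores each concept pair
    A_rec = 1.0 / (1.0 + np.exp(-(Z @ Z.T)))
    return A_rec, mu, logvar

# Toy graph: 4 concepts, edges mark (undirected) relations between them
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = rng.standard_normal((4, 8))          # concept features (e.g. phrase embeddings)
W0 = 0.1 * rng.standard_normal((8, 16))  # placeholder, untrained weights
W_mu = 0.1 * rng.standard_normal((16, 4))
W_lv = 0.1 * rng.standard_normal((16, 4))

A_rec, mu, logvar = vgae_forward(A, X, W0, W_mu, W_lv)
print(A_rec.shape)  # (4, 4): a link score for every concept pair
```

In training, the reconstruction loss on observed edges plus a KL term on (mu, logvar) would be minimized; here the weights are random, so the scores are meaningless beyond showing the data flow.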
Anthology ID:
2021.acl-short.127
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)
Month:
August
Year:
2021
Address:
Online
Editors:
Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
1005–1011
URL:
https://aclanthology.org/2021.acl-short.127
DOI:
10.18653/v1/2021.acl-short.127
Cite (ACL):
Irene Li, Vanessa Yan, Tianxiao Li, Rihao Qu, and Dragomir Radev. 2021. Unsupervised Cross-Domain Prerequisite Chain Learning using Variational Graph Autoencoders. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pages 1005–1011, Online. Association for Computational Linguistics.
Cite (Informal):
Unsupervised Cross-Domain Prerequisite Chain Learning using Variational Graph Autoencoders (Li et al., ACL-IJCNLP 2021)
PDF:
https://preview.aclanthology.org/naacl-24-ws-corrections/2021.acl-short.127.pdf
Optional supplementary material:
2021.acl-short.127.OptionalSupplementaryMaterial.zip
Video:
https://preview.aclanthology.org/naacl-24-ws-corrections/2021.acl-short.127.mp4
Data
LectureBank