Abstract
Concept prerequisite learning (CPL) plays a key role in developing technologies that help people learn a new complex topic or concept. Previous work commonly assumes that all concepts are given at training time and focuses solely on predicting the unseen prerequisite relationships between them. However, many real-world scenarios involve concepts that are undiscovered at training time, a setting that remains relatively unexplored. This paper studies this problem and proposes a novel alternating knowledge distillation approach that exploits both content- and graph-based models for the task. Extensive experiments on three public benchmarks demonstrate up to 10% improvements in F1 score.
- Anthology ID: 2022.emnlp-main.585
- Volume: Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
- Month: December
- Year: 2022
- Address: Abu Dhabi, United Arab Emirates
- Venue: EMNLP
- Publisher: Association for Computational Linguistics
- Pages: 8542–8548
- URL: https://aclanthology.org/2022.emnlp-main.585
- Cite (ACL): Yaxin Zhu and Hamed Zamani. 2022. Predicting Prerequisite Relations for Unseen Concepts. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 8542–8548, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
- Cite (Informal): Predicting Prerequisite Relations for Unseen Concepts (Zhu & Zamani, EMNLP 2022)
- PDF: https://preview.aclanthology.org/starsem-semeval-split/2022.emnlp-main.585.pdf
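The abstract mentions an alternating knowledge distillation approach between a content-based and a graph-based model, but gives no implementation details. The sketch below is purely illustrative and hypothetical — it shows one common way such alternation is set up (temperature-scaled soft labels and a KL divergence distillation loss, with teacher and student roles swapping each round); all function names and the role-swapping schedule are assumptions, not the authors' actual method.

```python
import math

def softmax(logits, temperature=2.0):
    # Temperature-scaled softmax, as commonly used to soften teacher
    # predictions in knowledge distillation (hypothetical choice of T).
    exps = [math.exp(l / temperature) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kl_divergence(p, q):
    # KL(p || q): distillation loss pulling the student distribution q
    # toward the teacher distribution p.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def alternating_distillation_loss(content_logits, graph_logits, round_idx):
    """One alternation round (illustrative): on even rounds the graph-based
    model acts as teacher for the content-based model; on odd rounds the
    roles swap. Returns the distillation loss for that round."""
    p_content = softmax(content_logits)
    p_graph = softmax(graph_logits)
    if round_idx % 2 == 0:
        teacher, student = p_graph, p_content  # graph model teaches
    else:
        teacher, student = p_content, p_graph  # content model teaches
    return kl_divergence(teacher, student)
```

In a full training loop, this loss would be combined with each model's supervised objective on the seen prerequisite pairs, with the teacher's parameters frozen within a round; those details are not specified in the abstract.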