Abstract
Curriculum learning provides a systematic approach to training: it refines training progressively, tailors it to task requirements, and improves generalization through exposure to diverse examples. We present a curriculum learning approach that builds on existing knowledge about text and graph complexity formalisms for training with text graph data. The core of our approach is a novel data scheduler, which employs "spaced repetition" and complexity formalisms to guide the training process. We demonstrate the effectiveness of the proposed approach on several text graph tasks and graph neural network architectures. The proposed model achieves larger performance gains while using less training data; consistently prefers text over graph complexity indices throughout training, while the best curricula derived from text and graph complexity indices are equally effective; and learns transferable curricula across GNN models and datasets. In addition, we find that both node-level (local) and graph-level (global) graph complexity indices, as well as shallow and traditional text complexity indices, play a crucial role in effective curriculum learning.
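The abstract describes the data scheduler only at a high level. As an illustration of the idea, the following is a minimal sketch of a spaced-repetition scheduler that orders samples by a complexity index, assuming a Leitner-style queue scheme with exponentially growing review intervals. The class name, the `2**q` review delay, and the promotion/demotion rule are illustrative assumptions, not the authors' implementation.

```python
import random

class SpacedRepetitionScheduler:
    """Illustrative Leitner-style scheduler (not the paper's implementation).

    Samples sit in numbered queues; queue q is reviewed every 2**q epochs.
    Samples the model handles well are promoted to later (less frequent)
    queues, and the rest are demoted back toward queue 0.
    """

    def __init__(self, sample_ids, complexity_fn, num_queues=5):
        # Order samples from easy to hard by a complexity index
        # (e.g., a text readability score or a node-degree-based graph index).
        self.order = sorted(sample_ids, key=complexity_fn)
        self.queue_of = {s: 0 for s in self.order}  # all samples start in queue 0
        self.num_queues = num_queues

    def due_samples(self, epoch):
        """Return the samples whose queue is scheduled for review at this epoch."""
        due = [s for s in self.order if epoch % (2 ** self.queue_of[s]) == 0]
        random.shuffle(due)
        return due

    def update(self, sample_id, correct):
        """Promote correctly predicted samples, demote the rest."""
        q = self.queue_of[sample_id]
        self.queue_of[sample_id] = (
            min(q + 1, self.num_queues - 1) if correct else max(q - 1, 0)
        )
```

In this sketch, hard or still-misclassified samples stay in early queues and are revisited frequently, while well-learned samples migrate to later queues and are reviewed only occasionally, so the per-epoch training set shrinks over time and its composition is paced by the complexity index.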
- Anthology ID:
- 2023.findings-emnlp.172
- Volume:
- Findings of the Association for Computational Linguistics: EMNLP 2023
- Month:
- December
- Year:
- 2023
- Address:
- Singapore
- Editors:
- Houda Bouamor, Juan Pino, Kalika Bali
- Venue:
- Findings
- Publisher:
- Association for Computational Linguistics
- Pages:
- 2610–2626
- URL:
- https://aclanthology.org/2023.findings-emnlp.172
- DOI:
- 10.18653/v1/2023.findings-emnlp.172
- Cite (ACL):
- Nidhi Vakil and Hadi Amiri. 2023. Complexity-Guided Curriculum Learning for Text Graphs. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 2610–2626, Singapore. Association for Computational Linguistics.
- Cite (Informal):
- Complexity-Guided Curriculum Learning for Text Graphs (Vakil & Amiri, Findings 2023)
- PDF:
- https://preview.aclanthology.org/naacl24-info/2023.findings-emnlp.172.pdf