An Effective Incorporating Heterogeneous Knowledge Curriculum Learning for Sequence Labeling

Xuemei Tang, Jun Wang, Qi Su, Chu-Ren Huang, Jinghang Gu


Abstract
Sequence labeling models often benefit from incorporating external knowledge. However, this practice introduces data heterogeneity and complicates the model with additional modules, increasing the cost of training a high-performing model. To address this challenge, we propose a dual-stage curriculum learning (DCL) framework designed specifically for sequence labeling tasks. The DCL framework improves training by introducing data instances gradually, from easy to hard. Additionally, we introduce a dynamic metric for evaluating difficulty in sequence labeling tasks. Experiments on several sequence labeling datasets show that our model improves performance and accelerates training, mitigating the slow training of complex models.
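The sketch below illustrates the general easy-to-hard scheduling idea described in the abstract: training instances are ranked by a difficulty score and the training pool grows over epochs. The difficulty callable and the linear pacing function here are illustrative assumptions only, not the paper's DCL metric or two-stage schedule.

```python
import math
from typing import Callable, List, Sequence, Tuple

def curriculum_batches(
    data: Sequence[Tuple[List[str], List[str]]],   # (tokens, labels) pairs
    difficulty: Callable[[Tuple[List[str], List[str]]], float],
    num_epochs: int,
    batch_size: int = 32,
):
    """Yield (epoch, batch) pairs whose training pool grows from easy to hard."""
    # Rank instances once by the assumed difficulty score, easiest first.
    ranked = sorted(data, key=difficulty)
    for epoch in range(1, num_epochs + 1):
        # Fraction of data exposed this epoch (simple linear pacing function).
        frac = min(1.0, epoch / num_epochs)
        pool = ranked[: max(batch_size, math.ceil(frac * len(ranked)))]
        for start in range(0, len(pool), batch_size):
            yield epoch, pool[start : start + batch_size]

# Toy usage: sentence length as a stand-in difficulty measure.
if __name__ == "__main__":
    toy = [(["New", "York"], ["B-LOC", "I-LOC"]),
           (["He", "visited", "Paris", "."], ["O", "O", "B-LOC", "O"])]
    for epoch, batch in curriculum_batches(toy, lambda ex: len(ex[0]),
                                           num_epochs=2, batch_size=2):
        print(epoch, [tokens for tokens, _ in batch])
```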
Anthology ID:
2025.acl-short.38
Volume:
Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
495–503
URL:
https://preview.aclanthology.org/landing_page/2025.acl-short.38/
Cite (ACL):
Xuemei Tang, Jun Wang, Qi Su, Chu-Ren Huang, and Jinghang Gu. 2025. An Effective Incorporating Heterogeneous Knowledge Curriculum Learning for Sequence Labeling. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 495–503, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
An Effective Incorporating Heterogeneous Knowledge Curriculum Learning for Sequence Labeling (Tang et al., ACL 2025)
PDF:
https://preview.aclanthology.org/landing_page/2025.acl-short.38.pdf