Task-Informed Anti-Curriculum by Masking Improves Downstream Performance on Text

Jarca Andrei, Florinel Alin Croitoru, Radu Tudor Ionescu


Abstract
Masked language modeling has become a widely adopted unsupervised technique to pre-train large language models (LLMs). However, the process of selecting tokens for masking is random, and the percentage of masked tokens is typically fixed for the entire training process. In this paper, we propose to adjust the masking ratio and to decide which tokens to mask based on a novel task-informed anti-curriculum learning scheme. First, we harness task-specific knowledge about useful and harmful tokens in order to determine which tokens to mask. Second, we propose a cyclic decaying masking ratio, which corresponds to an anti-curriculum schedule (from hard to easy). We exemplify our novel task-informed anti-curriculum by masking (TIACBM) approach across three diverse downstream tasks: sentiment analysis, text classification by topic, and authorship attribution. Our findings suggest that TIACBM enhances the ability of the model to focus on key task-relevant features, contributing to statistically significant performance gains across tasks. We release our code at https://github.com/JarcaAndrei/TIACBM.
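As a rough illustration of the two mechanisms the abstract describes, the sketch below pairs a cyclic decaying masking ratio (each cycle starts hard, at a high ratio, and decays toward an easier, lower ratio) with task-informed token selection. This is a minimal, hypothetical rendering: the cycle count, the ratio bounds, and the per-token `relevance` scores are illustrative assumptions, not the paper's settings; the authoritative implementation is the repository linked above.

```python
import numpy as np

def cyclic_decaying_ratio(step, total_steps, num_cycles=3,
                          max_ratio=0.30, min_ratio=0.10):
    """Anti-curriculum schedule: within each cycle, the masking ratio
    decays linearly from max_ratio (hard) to min_ratio (easy)."""
    cycle_len = total_steps / num_cycles
    phase = (step % cycle_len) / cycle_len   # position in current cycle, in [0, 1)
    return max_ratio - phase * (max_ratio - min_ratio)

def task_informed_mask(token_ids, relevance, ratio, mask_id):
    """Mask the `ratio` fraction of tokens with the highest task-relevance
    scores, so the model must reconstruct task-critical tokens."""
    n_mask = max(1, int(round(ratio * len(token_ids))))
    top = np.argsort(relevance)[-n_mask:]    # positions of most relevant tokens
    masked = np.asarray(token_ids).copy()
    masked[top] = mask_id
    return masked

# Example: the ratio resets and decays within each of 3 cycles over 300 steps.
for step in (0, 50, 99, 100, 150, 199):
    print(step, round(cyclic_decaying_ratio(step, 300), 3))
```

Masking the most task-relevant tokens first is what makes the schedule "hard to easy" in the abstract's sense: reconstructing task-critical tokens is the harder objective, consistent with the claimed focus on key task-relevant features.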
Anthology ID: 2025.findings-acl.113
Volume: Findings of the Association for Computational Linguistics: ACL 2025
Month: July
Year: 2025
Address: Vienna, Austria
Editors: Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 2192–2201
URL: https://preview.aclanthology.org/display_plenaries/2025.findings-acl.113/
Cite (ACL): Jarca Andrei, Florinel Alin Croitoru, and Radu Tudor Ionescu. 2025. Task-Informed Anti-Curriculum by Masking Improves Downstream Performance on Text. In Findings of the Association for Computational Linguistics: ACL 2025, pages 2192–2201, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal): Task-Informed Anti-Curriculum by Masking Improves Downstream Performance on Text (Andrei et al., Findings 2025)
PDF: https://preview.aclanthology.org/display_plenaries/2025.findings-acl.113.pdf