Do Data-based Curricula Work?
Maxim Surkov | Vladislav Mosin | Ivan Yamshchikov
Proceedings of the Third Workshop on Insights from Negative Results in NLP, 2022
Current state-of-the-art NLP systems use large neural networks that require extensive computational resources for training. Inspired by human knowledge acquisition, researchers have proposed curriculum learning: sequencing tasks (task-based curricula) or ordering and sampling datasets (data-based curricula) in ways that facilitate training. This work investigates the benefits of data-based curriculum learning for large language models such as BERT and T5. We experiment with various curricula based on complexity measures and different sampling strategies. Extensive experiments on several NLP tasks show that curricula based on various complexity measures rarely provide any benefit, while random sampling performs as well as or better than curricula.
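To illustrate what a data-based curriculum means in practice, the sketch below (not the paper's code; the complexity measure and schedule are assumptions for illustration) orders examples by a simple difficulty proxy and gradually widens the pool of examples sampled for each batch, alongside the uniform random-sampling baseline the abstract refers to.

```python
# Minimal sketch of data-based curriculum sampling vs. random sampling.
# The complexity measure (token count) and the linear competence schedule
# are illustrative assumptions, not the paper's exact setup.
import random


def complexity(example: str) -> int:
    # Hypothetical difficulty proxy: number of whitespace-separated tokens.
    return len(example.split())


def curriculum_batches(dataset, batch_size, total_steps):
    """At step t, sample only from the easiest fraction of the data,
    growing linearly from 10% of examples to the full dataset."""
    ordered = sorted(dataset, key=complexity)
    for step in range(total_steps):
        competence = min(1.0, 0.1 + 0.9 * step / max(1, total_steps - 1))
        pool = ordered[: max(batch_size, int(competence * len(ordered)))]
        yield random.sample(pool, batch_size)


def random_batches(dataset, batch_size, total_steps):
    """Baseline: uniform sampling over the full dataset at every step."""
    for _ in range(total_steps):
        yield random.sample(dataset, batch_size)


if __name__ == "__main__":
    data = [
        "short sentence",
        "a slightly longer training sentence here",
        "an even longer example sentence with many more tokens than the others",
        "tiny",
        "medium length example sentence",
        "another medium sized sentence for the toy dataset",
    ]
    for step, batch in enumerate(curriculum_batches(data, batch_size=2, total_steps=3)):
        print(f"curriculum step {step}: {batch}")
    for step, batch in enumerate(random_batches(data, batch_size=2, total_steps=3)):
        print(f"random step {step}: {batch}")
```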