Less is More: Pre-Training Cross-Lingual Small-Scale Language Models with Cognitively-Plausible Curriculum Learning Strategies
Suchir Salhan, Richard Diehl Martinez, Zébulon Goriely, Paula Buttery
Abstract
Curriculum Learning has been a popular strategy to improve the cognitive plausibility of Small-Scale Language Models (SSLMs) in the BabyLM Challenge. However, it has not led to considerable improvements over non-curriculum models. We assess whether theories of language acquisition can be used to specify more fine-grained curriculum learning strategies, creating age-ordered corpora of Child-Directed Speech for four typologically distant language families to implement SSLMs and acquisition-inspired curricula cross-lingually. Comparing the success of three objective curricula (Growing, Inwards & MMM) that precisely replicate the predictions of acquisition theories on a standard SSLM architecture, we find that fine-grained acquisition-inspired curricula can outperform non-curriculum baselines, and that the performance benefits of curriculum strategies in SSLMs can be derived by specifying fine-grained, language-specific curricula that precisely replicate language acquisition theories.
- Anthology ID: 2024.conll-babylm.15
- Volume: The 2nd BabyLM Challenge at the 28th Conference on Computational Natural Language Learning
- Month: November
- Year: 2024
- Address: Miami, FL, USA
- Editors: Michael Y. Hu, Aaron Mueller, Candace Ross, Adina Williams, Tal Linzen, Chengxu Zhuang, Leshem Choshen, Ryan Cotterell, Alex Warstadt, Ethan Gotlieb Wilcox
- Venues: CoNLL | BabyLM | WS
- Publisher: Association for Computational Linguistics
- Pages: 174–188
- URL: https://preview.aclanthology.org/fix-sig-urls/2024.conll-babylm.15/
- Cite (ACL): Suchir Salhan, Richard Diehl Martinez, Zébulon Goriely, and Paula Buttery. 2024. Less is More: Pre-Training Cross-Lingual Small-Scale Language Models with Cognitively-Plausible Curriculum Learning Strategies. In The 2nd BabyLM Challenge at the 28th Conference on Computational Natural Language Learning, pages 174–188, Miami, FL, USA. Association for Computational Linguistics.
- Cite (Informal): Less is More: Pre-Training Cross-Lingual Small-Scale Language Models with Cognitively-Plausible Curriculum Learning Strategies (Salhan et al., CoNLL-BabyLM 2024)
- PDF: https://preview.aclanthology.org/fix-sig-urls/2024.conll-babylm.15.pdf