SlovakBabyLM: Replication of the BabyLM and Sample-efficient Pretraining for a Low-Resource Language

Ľuboš Kriš, Marek Suppa


Abstract
In recent years, we can observe a trend of creating various language-specific language models (LMs) within the Slavic language family based on the BERT architecture. However, as the number of parameters in LMs grows, larger amounts of text are required for good performance, which can hinder the development of LMs for specific languages. Our research looks for a solution in Curriculum Learning (CL) methods, which can help us build better models with less text than current LMs require, and can thus aid the pretraining of models for low-resource languages (LRLs). We therefore replicate the BabyLM Challenge in the Slovak language (Dataset: https://huggingface.co/datasets/ubokri/SlovakBabyLM, Code: https://github.com/baucek/Slovakbabylm/tree/main). Additionally, we apply CL to compare the effect of CL methods on English and Slovak, and evaluate whether CL improves LM performance. Our experiments show that using CL methods as a preprocessing step significantly improves model performance in sentiment analysis and question answering.
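The abstract describes curriculum learning applied as a preprocessing step over the pretraining corpus. A minimal sketch of what such data ordering might look like, assuming the linked dataset exposes a `train` split with a `text` column and using sentence length as an illustrative difficulty proxy (the paper's actual CL criteria are not stated on this page):

```python
# Curriculum-style ordering of pretraining data, sketched with the
# HuggingFace datasets library. The difficulty measure here (token count)
# is an assumed, commonly used proxy, not necessarily the paper's method.
from datasets import load_dataset

# Dataset ID taken from the abstract; split and column names are assumptions.
dataset = load_dataset("ubokri/SlovakBabyLM", split="train")

# Score each example, then order from "easy" (short) to "hard" (long)
# before feeding the corpus to the pretraining loop.
def difficulty(example):
    return {"difficulty": len(example["text"].split())}

curriculum = (
    dataset
    .map(difficulty)
    .sort("difficulty")
    .remove_columns("difficulty")
)
```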
Anthology ID:
2025.babylm-main.23
Volume:
Proceedings of the First BabyLM Workshop
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Lucas Charpentier, Leshem Choshen, Ryan Cotterell, Mustafa Omer Gul, Michael Y. Hu, Jing Liu, Jaap Jumelet, Tal Linzen, Aaron Mueller, Candace Ross, Raj Sanjay Shah, Alex Warstadt, Ethan Gotlieb Wilcox, Adina Williams
Venue:
BabyLM
Publisher:
Association for Computational Linguistics
Pages:
301–312
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.babylm-main.23/
Cite (ACL):
Ľuboš Kriš and Marek Suppa. 2025. SlovakBabyLM: Replication of the BabyLM and Sample-efficient Pretraining for a Low-Resource Language. In Proceedings of the First BabyLM Workshop, pages 301–312, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
SlovakBabyLM: Replication of the BabyLM and Sample-efficient Pretraining for a Low-Resource Language (Kriš & Suppa, BabyLM 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.babylm-main.23.pdf