2025
A Grammar-Based Method for Instilling Empirical Dependency Structure in LLMs
Olle Torstensson | Oskar Holmström
Proceedings of the 9th Workshop on Constraint Grammar and Finite State NLP
We investigate whether synthetic pretraining data generated from a formal grammar modeling syntactic dependencies can improve English language models. Building upon the structured pretraining data approach of Papadimitriou and Jurafsky (2023), we develop a grammar that more closely mirrors empirical dependency structures. Our results are negative: this type of pretraining significantly degrades model performance, with both our approach and theirs performing worse than no pretraining at all. We analyze potential explanations for these findings and discuss implications for future work on structured-data pretraining.