Which Word Orders Facilitate Length Generalization in LMs? An Investigation with GCG-Based Artificial Languages

Nadine El-Naggar, Tatsuki Kuribayashi, Ted Briscoe


Abstract
Prior work has investigated whether language models (LMs) have inductive biases that favor typologically frequent grammatical properties over rare, implausible ones, typically using artificial languages (ALs) (White and Cotterell, 2021; Kuribayashi et al., 2024). In this paper, we extend this line of work in two ways. First, we extend the context-free AL formalization of prior work by adopting Generalized Categorial Grammar (GCG) (Wood, 2014), which allows ALs to cover attested but previously overlooked constructions, such as unbounded dependencies and mildly context-sensitive structures. Second, our evaluation focuses more directly on the ability of LMs to generalize to unseen, longer test sentences. Our ALs thus better capture features of natural languages, and our experimental paradigm yields a clearer conclusion: typologically plausible word orders tend to be easier for LMs to generalize productively.
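The abstract summarizes the paradigm only at a high level. As a rough illustration, the sketch below shows how such an experiment could be set up: a toy recursive generator producing two word-order variants of the same artificial language, plus a split in which test sentences are strictly longer (deeper) than anything seen in training. The lexicon (n1, v1, c1), the two orders, and the depth cutoffs are all hypothetical simplifications introduced here; the paper's actual languages are defined with Generalized Categorial Grammar, which this toy context-free generator does not implement.

import random

random.seed(0)

NOUNS = ["n1", "n2", "n3"]
VERBS = ["v1", "v2"]
COMP = "c1"  # complementizer licensing clausal embedding

def sentence(depth, order="svo"):
    """Recursively generate a clause; embedding depth drives length."""
    s, o, v = random.choice(NOUNS), random.choice(NOUNS), random.choice(VERBS)
    if depth == 0:
        return [s, v, o] if order == "svo" else [s, o, v]
    emb = sentence(depth - 1, order)
    if order == "svo":
        # Head-initial: the complementizer precedes the embedded clause.
        return [s, v, COMP] + emb
    # Head-final: the embedded clause precedes the complementizer and verb.
    return [s] + emb + [COMP, v]

# Length-generalization split: train on shallow (short) sentences and
# evaluate on strictly deeper (longer) ones.
train = [sentence(random.randint(0, 2), o) for o in ("svo", "sov") for _ in range(3)]
test = [sentence(random.randint(3, 5), o) for o in ("svo", "sov") for _ in range(3)]

for name, split in (("train", train), ("test", test)):
    for sent in split:
        print(f"{name}\tlen={len(sent)}\t{' '.join(sent)}")

Controlling embedding depth rather than raw token count means longer test items also exercise deeper recursion, which is the structural property a length-generalization probe is meant to stress.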
Anthology ID:
2025.emnlp-main.1803
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
35587–35601
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1803/
Cite (ACL):
Nadine El-Naggar, Tatsuki Kuribayashi, and Ted Briscoe. 2025. Which Word Orders Facilitate Length Generalization in LMs? An Investigation with GCG-Based Artificial Languages. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 35587–35601, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Which Word Orders Facilitate Length Generalization in LMs? An Investigation with GCG-Based Artificial Languages (El-Naggar et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1803.pdf
Checklist:
 2025.emnlp-main.1803.checklist.pdf