Capturing Online SRC/ORC Effort with Memory Measures from a Minimalist Parser

Aniello De Santo


Abstract
A parser for Minimalist grammars (Stabler, 2013) has been shown to successfully model sentence processing preferences across an array of languages and phenomena when combined with complexity metrics that relate parsing behavior to memory usage (Gerth, 2015; Graf et al., 2017; De Santo, 2020, a.o.). This model provides a quantifiable theory of the effects of fine-grained grammatical structure on cognitive cost, and can help strengthen the link between generative syntactic theory and sentence processing. However, work on it has focused on offline asymmetries. Here, we extend this approach by showing how memory-based measures of effort that explicitly consider minimalist-like structure-building operations improve our ability to account for word-by-word (online) behavioral data.
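As a rough illustration of the kind of memory-based complexity metrics the abstract refers to, the sketch below assumes the tenure-style setup used in this line of work (e.g., Graf et al., 2017): each node of an MG derivation tree is annotated with the parser step at which it is first conjectured (its index) and the step at which it is flushed from memory (its outdex), and metrics are simple functions of the difference. The tree encoding, node labels, and function names here are hypothetical, not taken from the paper.

```python
from dataclasses import dataclass, field
from typing import Iterator, List


@dataclass
class Node:
    """Node of an annotated MG derivation tree (hypothetical encoding)."""
    label: str
    index: int    # parser step at which the node enters memory
    outdex: int   # parser step at which the node leaves memory
    children: List["Node"] = field(default_factory=list)


def nodes(tree: Node) -> Iterator[Node]:
    """Yield every node of the derivation tree (preorder)."""
    yield tree
    for child in tree.children:
        yield from nodes(child)


def tenure(node: Node) -> int:
    """Number of steps the node spends in memory."""
    return node.outdex - node.index


def max_tenure(tree: Node) -> int:
    """Longest time any single node is held in memory."""
    return max(tenure(n) for n in nodes(tree))


def sum_tenure(tree: Node) -> int:
    """Total memory load, summed over all nodes."""
    return sum(tenure(n) for n in nodes(tree))


if __name__ == "__main__":
    # Toy annotated derivation fragment; indices are illustrative only.
    toy = Node("CP", 1, 2, [Node("C", 2, 3),
                            Node("TP", 2, 6, [Node("T", 6, 7)])])
    print(max_tenure(toy), sum_tenure(toy))
```

Offline comparisons in this literature typically rank whole derivations by such summary metrics; the paper's contribution is to relate metrics of this family to word-by-word (online) measures instead.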
Anthology ID:
2025.cmcl-1.5
Volume:
Proceedings of the Workshop on Cognitive Modeling and Computational Linguistics
Month:
May
Year:
2025
Address:
Albuquerque, New Mexico, USA
Editors:
Tatsuki Kuribayashi, Giulia Rambelli, Ece Takmaz, Philipp Wicke, Jixing Li, Byung-Doh Oh
Venues:
CMCL | WS
Publisher:
Association for Computational Linguistics
Pages:
24–35
URL:
https://preview.aclanthology.org/landing_page/2025.cmcl-1.5/
Cite (ACL):
Aniello De Santo. 2025. Capturing Online SRC/ORC Effort with Memory Measures from a Minimalist Parser. In Proceedings of the Workshop on Cognitive Modeling and Computational Linguistics, pages 24–35, Albuquerque, New Mexico, USA. Association for Computational Linguistics.
Cite (Informal):
Capturing Online SRC/ORC Effort with Memory Measures from a Minimalist Parser (De Santo, CMCL 2025)
PDF:
https://preview.aclanthology.org/landing_page/2025.cmcl-1.5.pdf