Abstract
English relative clauses are a critical test case for theories of syntactic processing. Expectation- and memory-based accounts make opposing predictions, and behavioral experiments have found mixed results. We present a technical extension of Lossy Context Surprisal (LCS) and use it to model relative clause processing in three behavioral experiments. LCS predicts key results at distinct retention rates, showing that task-dependent memory demands can account for discrepant behavioral patterns in the literature.
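To make the retention-rate idea concrete, here is a minimal sketch of lossy-context surprisal under one simple assumption: each context token is independently retained in memory with probability `retention_rate`, and the model marginalizes over all possible erasure patterns. This is an illustration, not the authors' implementation; the names `lossy_context_surprisal`, `lm_prob`, and `toy_lm` are hypothetical, and the paper's actual noise model and extension may differ.

```python
import itertools
import math

def lossy_context_surprisal(word, context, lm_prob, retention_rate):
    """Sketch: surprisal of `word` given a lossy memory of `context`.

    Assumes an erasure-style noise model in which each context token is
    independently retained with probability `retention_rate`. We sum
    exactly over all 2^n retention patterns, so this only scales to
    short contexts. `lm_prob(context)` is an assumed interface that
    returns P(next word | context) as a dict.
    """
    total = 0.0
    n = len(context)
    for pattern in itertools.product((True, False), repeat=n):
        # Context as the comprehender might remember it: kept tokens only.
        kept = tuple(tok for tok, keep in zip(context, pattern) if keep)
        k = sum(pattern)
        # Probability of this particular retention pattern.
        weight = retention_rate ** k * (1 - retention_rate) ** (n - k)
        total += weight * lm_prob(kept).get(word, 1e-9)
    return -math.log2(total)

def toy_lm(context):
    # Hypothetical toy distribution: remembering "reporter" makes
    # "sent" the likelier continuation.
    if "reporter" in context:
        return {"sent": 0.7, "attacked": 0.3}
    return {"sent": 0.5, "attacked": 0.5}

for r in (0.3, 0.9):
    bits = lossy_context_surprisal("sent", ("the", "reporter", "who"), toy_lm, r)
    print(f"retention rate {r}: {bits:.2f} bits")
```

Running this toy example shows the task-dependence the abstract describes: at a high retention rate the context word "reporter" is usually remembered and "sent" is less surprising, while at a low retention rate the prediction falls back toward the context-free baseline.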
- Anthology ID: 2024.conll-1.4
- Volume: Proceedings of the 28th Conference on Computational Natural Language Learning
- Month: November
- Year: 2024
- Address: Miami, FL, USA
- Editors: Libby Barak, Malihe Alikhani
- Venue: CoNLL
- Publisher: Association for Computational Linguistics
- Pages: 36–45
- URL: https://aclanthology.org/2024.conll-1.4
- DOI: 10.18653/v1/2024.conll-1.4
- Cite (ACL): Kate McCurdy and Michael Hahn. 2024. Lossy Context Surprisal Predicts Task-Dependent Patterns in Relative Clause Processing. In Proceedings of the 28th Conference on Computational Natural Language Learning, pages 36–45, Miami, FL, USA. Association for Computational Linguistics.
- Cite (Informal): Lossy Context Surprisal Predicts Task-Dependent Patterns in Relative Clause Processing (McCurdy & Hahn, CoNLL 2024)
- PDF: https://preview.aclanthology.org/dois-2013-emnlp/2024.conll-1.4.pdf