Noisy-context surprisal as a human sentence processing cost model

Richard Futrell, Roger Levy


Abstract
We use the noisy-channel theory of human sentence comprehension to develop an incremental processing cost model that unifies and extends key features of expectation-based and memory-based models. In this model, which we call noisy-context surprisal, the processing cost of a word is the surprisal of the word given a noisy representation of the preceding context. We show that this model accounts for an outstanding puzzle in sentence comprehension, language-dependent structural forgetting effects (Gibson and Thomas, 1999; Vasishth et al., 2010; Frank et al., 2016), which have not previously been well modeled by either expectation-based or memory-based approaches. Additionally, we show that this model derives and generalizes locality effects (Gibson, 1998; Demberg and Keller, 2008), a signature prediction of memory-based models. We give corpus-based evidence for a key assumption in this derivation.
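The abstract's central quantity can be sketched concretely. The following toy Python example (not the paper's implementation; the conditional model, erasure-noise channel, and all names here are illustrative assumptions) computes the surprisal of a word after marginalizing over noisy versions of its context, where each context word is independently erased with some probability:

```python
import itertools
import math

def noisy_context_surprisal(word, context, cond_prob, erase_prob=0.3):
    """Surprisal of `word` given a noisy representation of `context`.

    Hypothetical sketch: cost(w) = -log2 E_noise[ P(w | noisy(context)) ],
    where the noise channel independently erases each context word with
    probability `erase_prob`. `cond_prob(word, context_tuple)` is the base
    language model's conditional probability.
    """
    total = 0.0
    n = len(context)
    # Enumerate all 2^n erasure patterns and weight each by its probability.
    for keep in itertools.product([True, False], repeat=n):
        kept = tuple(w for w, k in zip(context, keep) if k)
        weight = 1.0
        for k in keep:
            weight *= (1 - erase_prob) if k else erase_prob
        total += weight * cond_prob(word, kept)
    return -math.log2(total)

# Toy conditional model (made up for illustration): "dog" is highly
# predictable after "the"; with a fully erased context it is less so.
def toy_model(word, context):
    if "the" in context:
        return 0.8 if word == "dog" else 0.2
    return 0.5

# With a noiseless channel this reduces to ordinary surprisal.
clean = noisy_context_surprisal("dog", ("the",), toy_model, erase_prob=0.0)
noisy = noisy_context_surprisal("dog", ("the",), toy_model, erase_prob=0.5)
```

Under these toy assumptions, noise in the context raises the surprisal of an otherwise predictable word (`noisy > clean`), which is the mechanism the abstract appeals to for forgetting and locality effects.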
Anthology ID: E17-1065
Volume: Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 1, Long Papers
Month: April
Year: 2017
Address: Valencia, Spain
Venue: EACL
Publisher: Association for Computational Linguistics
Pages: 688–698
URL: https://aclanthology.org/E17-1065
Cite (ACL):
Richard Futrell and Roger Levy. 2017. Noisy-context surprisal as a human sentence processing cost model. In Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 1, Long Papers, pages 688–698, Valencia, Spain. Association for Computational Linguistics.
Cite (Informal):
Noisy-context surprisal as a human sentence processing cost model (Futrell & Levy, EACL 2017)
PDF: https://preview.aclanthology.org/update-css-js/E17-1065.pdf