Modeling Bottom-up Information Quality during Language Processing

Cui Ding, Yanning Yin, Lena Ann Jäger, Ethan Wilcox


Abstract
Contemporary theories model language processing as integrating both top-down expectations and bottom-up inputs. One major prediction of such models is that the quality of the bottom-up inputs modulates ease of processing—noisy inputs should lead to difficult and effortful comprehension. We test this prediction in the domain of reading. First, we propose an information-theoretic operationalization of the “quality” of bottom-up information as the mutual information (MI) between visual information and word identity. We formalize this prediction in a mathematical model of reading as a Bayesian update. Second, we test our operationalization by comparing participants’ reading times on words whose information quality has been reduced by occluding either their top or bottom half against reading times on fully visible words. We collect data in English and Chinese. We then use multimodal language models to estimate the mutual information between visual inputs and words, and use these estimates to quantify the specific effect of reduced information quality on reading times. Finally, we compare how information is distributed across visual forms. In both English and Chinese, the upper half of a word contains more information about word identity than the lower half. However, the asymmetry is more pronounced in English, a pattern that is also reflected in the reading times.
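As a minimal illustration of the MI operationalization described in the abstract, the sketch below computes I(V; W) between a visual input V and word identity W from a toy joint distribution. The paper itself estimates MI with multimodal language models; the distributions and labels here are invented for illustration only.

```python
import math

def mutual_information(joint):
    """I(V; W) in bits, from a dict {(v, w): p(v, w)}.

    A fully informative visual form yields maximal MI; a fully
    occluded (uninformative) form yields zero MI.
    """
    # Marginal distributions p(v) and p(w).
    pv, pw = {}, {}
    for (v, w), p in joint.items():
        pv[v] = pv.get(v, 0.0) + p
        pw[w] = pw.get(w, 0.0) + p
    # I(V; W) = sum_{v,w} p(v,w) log2[ p(v,w) / (p(v) p(w)) ]
    mi = 0.0
    for (v, w), p in joint.items():
        if p > 0:
            mi += p * math.log2(p / (pv[v] * pw[w]))
    return mi

# Hypothetical conditions: a visual form that identifies the word
# perfectly vs. one (e.g., heavily occluded) that carries no
# information about word identity.
perfect = {("v1", "cat"): 0.5, ("v2", "dog"): 0.5}
occluded = {("v1", "cat"): 0.25, ("v1", "dog"): 0.25,
            ("v2", "cat"): 0.25, ("v2", "dog"): 0.25}

print(mutual_information(perfect))   # -> 1.0 bit
print(mutual_information(occluded))  # -> 0.0 bits
```

Under the paper's account, lower MI between the visual input and word identity (as in the occluded condition) predicts slower, more effortful reading.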
Anthology ID:
2025.emnlp-main.592
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
11720–11732
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.592/
Cite (ACL):
Cui Ding, Yanning Yin, Lena Ann Jäger, and Ethan Wilcox. 2025. Modeling Bottom-up Information Quality during Language Processing. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 11720–11732, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Modeling Bottom-up Information Quality during Language Processing (Ding et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.592.pdf
Checklist:
2025.emnlp-main.592.checklist.pdf