Human Sentence Processing: Recurrence or Attention?

Danny Merkx, Stefan L. Frank


Abstract
Recurrent neural networks (RNNs) have long been an architecture of interest for computational models of human sentence processing. The recently introduced Transformer architecture outperforms RNNs on many natural language processing tasks, but little is known about its ability to model human language processing. We compare Transformer- and RNN-based language models' ability to account for measures of human reading effort. Our analysis shows Transformers to outperform RNNs in explaining self-paced reading times and neural activity during reading of English sentences, challenging the widely held idea that human sentence processing involves recurrent and immediate processing and providing evidence for cue-based retrieval.
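The comparison described in the abstract rests on linking a language model's word-by-word predictions to human reading effort. The sketch below (not the authors' code) illustrates the general idea under several assumptions: it uses an off-the-shelf pretrained Transformer ("gpt2" from the Hugging Face transformers library), invented per-word reading times, and a simple Pearson correlation, whereas the paper trains its own RNN and Transformer language models and relates their surprisal estimates to real self-paced reading and neural data.

```python
import math

import torch
from scipy.stats import pearsonr
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

# Illustrative choice of model; the paper trains its own language models.
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()


def word_surprisals(sentence):
    """Return per-word surprisal (in bits) for every word after the first."""
    words = sentence.split()
    surprisals = []
    for i in range(1, len(words)):
        context_ids = tokenizer(" ".join(words[:i]), return_tensors="pt").input_ids
        target_ids = tokenizer(" " + words[i], return_tensors="pt").input_ids[0]
        input_ids = torch.cat([context_ids[0], target_ids]).unsqueeze(0)
        with torch.no_grad():
            log_probs = torch.log_softmax(model(input_ids).logits, dim=-1)
        # Surprisal of a word = negative sum of log-probabilities of its subword
        # tokens; the logits at position t predict the token at position t + 1.
        nats = 0.0
        for j, tok in enumerate(target_ids):
            nats -= log_probs[0, context_ids.shape[1] + j - 1, tok].item()
        surprisals.append(nats / math.log(2))
    return surprisals


# Toy usage: correlate model surprisal with hypothetical per-word reading times (ms).
sentence = "The horse raced past the barn fell"
reading_times = [330, 345, 360, 340, 355, 520]  # invented numbers, one per word after the first
r, p = pearsonr(word_surprisals(sentence), reading_times)
print(f"Pearson r = {r:.2f} (p = {p:.3f})")
```

In the paper's actual analyses, surprisal estimates of this kind are entered as predictors of reading times and neural measures, and the model architectures are compared by how much variance their predictors explain.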
Anthology ID: 2021.cmcl-1.2
Volume: Proceedings of the Workshop on Cognitive Modeling and Computational Linguistics
Month: June
Year: 2021
Address: Online
Editors: Emmanuele Chersoni, Nora Hollenstein, Cassandra Jacobs, Yohei Oseki, Laurent Prévot, Enrico Santus
Venue: CMCL
Publisher: Association for Computational Linguistics
Pages: 12–22
URL: https://aclanthology.org/2021.cmcl-1.2
DOI: 10.18653/v1/2021.cmcl-1.2
Cite (ACL):
Danny Merkx and Stefan L. Frank. 2021. Human Sentence Processing: Recurrence or Attention?. In Proceedings of the Workshop on Cognitive Modeling and Computational Linguistics, pages 12–22, Online. Association for Computational Linguistics.
Cite (Informal):
Human Sentence Processing: Recurrence or Attention? (Merkx & Frank, CMCL 2021)
PDF: https://preview.aclanthology.org/emnlp-22-attachments/2021.cmcl-1.2.pdf
Code: DannyMerkx/next_word_prediction