A Framework for Decoding Event-Related Potentials from Text

Shaorong Yan, Aaron Steven White


Abstract
We propose a novel framework for modeling event-related potentials (ERPs) collected during reading that couples pre-trained convolutional decoders with a language model. Using this framework, we compare the abilities of a variety of existing and novel sentence processing models to reconstruct ERPs. We find that modern contextual word embeddings underperform surprisal-based models but that, combined, the two outperform either on its own.
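The abstract describes decoding ERPs from text-derived representations and measuring how well different sentence processing models reconstruct the neural signal. As a minimal, purely illustrative sketch of that general setup (not the paper's actual architecture, which couples pre-trained convolutional decoders with a language model), one can imagine mapping word embeddings to per-electrode ERP amplitudes with ridge regression and scoring the reconstruction; all data and dimensions below are synthetic assumptions:

```python
# Hypothetical sketch of the ERP-decoding setup: predict per-word ERP
# amplitudes from word embeddings via ridge regression. This is NOT the
# paper's model; it only illustrates the embedding -> ERP mapping and
# reconstruction-quality evaluation described in the abstract.
import numpy as np

rng = np.random.default_rng(0)

n_words, emb_dim, n_electrodes = 200, 16, 8

# Synthetic stand-ins: word embeddings X and ERP amplitudes Y
# (one amplitude per electrode per word).
X = rng.normal(size=(n_words, emb_dim))
true_W = rng.normal(size=(emb_dim, n_electrodes))
Y = X @ true_W + 0.1 * rng.normal(size=(n_words, n_electrodes))

def ridge_fit(X, Y, lam=1.0):
    """Closed-form ridge regression: W = (X'X + lam*I)^-1 X'Y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)

W = ridge_fit(X, Y)
Y_hat = X @ W

# Reconstruction quality: proportion of variance explained (R^2).
r2 = 1 - ((Y - Y_hat) ** 2).sum() / ((Y - Y.mean(axis=0)) ** 2).sum()
print(f"R^2 = {r2:.3f}")
```

Comparing such reconstruction scores across feature sets (e.g., surprisal values versus contextual embeddings, or their concatenation) is the kind of model comparison the abstract reports, though the paper's decoders are convolutional rather than linear.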
Anthology ID:
W19-2910
Volume:
Proceedings of the Workshop on Cognitive Modeling and Computational Linguistics
Month:
June
Year:
2019
Address:
Minneapolis, Minnesota
Editors:
Emmanuele Chersoni, Cassandra Jacobs, Alessandro Lenci, Tal Linzen, Laurent Prévot, Enrico Santus
Venue:
CMCL
Publisher:
Association for Computational Linguistics
Pages:
86–92
URL:
https://aclanthology.org/W19-2910
DOI:
10.18653/v1/W19-2910
Cite (ACL):
Shaorong Yan and Aaron Steven White. 2019. A Framework for Decoding Event-Related Potentials from Text. In Proceedings of the Workshop on Cognitive Modeling and Computational Linguistics, pages 86–92, Minneapolis, Minnesota. Association for Computational Linguistics.
Cite (Informal):
A Framework for Decoding Event-Related Potentials from Text (Yan & White, CMCL 2019)
PDF:
https://preview.aclanthology.org/nschneid-patch-1/W19-2910.pdf