Abstract
The cognitive mechanisms needed to account for the English past tense have long been a subject of debate in linguistics and cognitive science. Neural network models were proposed early on, but were shown to have clear flaws. Recently, however, Kirov and Cotterell (2018) showed that modern encoder-decoder (ED) models overcome many of these flaws. They also presented evidence that ED models demonstrate humanlike performance in a nonce-word task. Here, we look more closely at the behaviour of their model in this task. We find that (1) the model exhibits instability across multiple simulations in terms of its correlation with human data, and (2) even when results are aggregated across simulations (treating each simulation as an individual human participant), the fit to the human data is not strong; in fact, it is worse than that of an older rule-based model. These findings hold up across several alternative training regimes and evaluation measures. Although other neural architectures might do better, we conclude that there is still insufficient evidence to claim that neural nets are a good cognitive model for this task.
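To make the aggregation-based evaluation concrete, here is a minimal sketch of the idea of treating each simulation as one participant and correlating model scores with human data. This is illustrative only, not the authors' code: the array shapes, the toy data, and the use of Pearson correlation are all assumptions.

```python
import numpy as np
from scipy.stats import pearsonr

# Toy data (shapes and values are assumptions, not from the paper):
# human_probs[i]    = human production probability for nonce form i
# model_probs[s, i] = probability that simulation s assigns to form i
rng = np.random.default_rng(0)
human_probs = rng.random(58)        # e.g. 58 nonce verbs
model_probs = rng.random((25, 58))  # e.g. 25 independent simulations

# Per-simulation fit: correlate each run with the human data; a wide
# spread here corresponds to the instability reported in finding (1).
per_sim_r = np.array([pearsonr(sim, human_probs)[0] for sim in model_probs])
print(f"per-simulation r: mean={per_sim_r.mean():.3f}, sd={per_sim_r.std():.3f}")

# Aggregated fit: average over simulations (each treated like one human
# participant), then correlate the mean scores with the human data.
aggregated_r = pearsonr(model_probs.mean(axis=0), human_probs)[0]
print(f"aggregated r = {aggregated_r:.3f}")
```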
- Anthology ID: P19-1376
- Volume: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
- Month: July
- Year: 2019
- Address: Florence, Italy
- Editors: Anna Korhonen, David Traum, Lluís Màrquez
- Venue: ACL
- Publisher: Association for Computational Linguistics
- Pages: 3868–3877
- URL: https://aclanthology.org/P19-1376
- DOI: 10.18653/v1/P19-1376
- Cite (ACL): Maria Corkery, Yevgen Matusevych, and Sharon Goldwater. 2019. Are we there yet? Encoder-decoder neural networks as cognitive models of English past tense inflection. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 3868–3877, Florence, Italy. Association for Computational Linguistics.
- Cite (Informal): Are we there yet? Encoder-decoder neural networks as cognitive models of English past tense inflection (Corkery et al., ACL 2019)
- PDF: https://preview.aclanthology.org/ingest-acl-2023-videos/P19-1376.pdf