Transformers in the loop: Polarity in neural models of language

Lisa Bylinina, Alexey Tikhonov

Abstract
Representation of linguistic phenomena in computational language models is typically assessed against the predictions of existing linguistic theories of these phenomena. Using the notion of polarity as a case study, we show that this is not always the most adequate set-up. We probe polarity via so-called ‘negative polarity items’ (in particular, English ‘any’) in two pre-trained Transformer-based models (BERT and GPT-2). We show that – at least for polarity – metrics derived from language models are more consistent with data from psycholinguistic experiments than linguistic theory predictions. Establishing this allows us to more adequately evaluate the performance of language models and also to use language models to discover new insights into natural language grammar beyond existing linguistic theories. This work contributes to establishing closer ties between psycholinguistic experiments and experiments with language models.
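As a rough illustration of this kind of probing (not the authors' exact metric), a common way to derive acceptability-like scores from a causal language model such as GPT-2 is to compare sentence log-probabilities for a minimal pair in which *any* is or is not licensed by negation. The sketch below assumes the HuggingFace transformers library; the example pair and the scoring function are illustrative only.

```python
# Hedged sketch: acceptability-style scoring of an NPI minimal pair with GPT-2.
# This is an illustration of the general technique, not the paper's metric.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def sentence_logprob(sentence: str) -> float:
    """Total log-probability of a sentence under GPT-2."""
    ids = tokenizer(sentence, return_tensors="pt").input_ids
    with torch.no_grad():
        # With labels=input_ids the model shifts internally and returns the
        # mean cross-entropy over the (len - 1) predicted tokens.
        loss = model(ids, labels=ids).loss
    return -loss.item() * (ids.shape[1] - 1)

# 'any' is licensed under negation but degraded in a plain positive sentence.
pair = ("Nobody saw any birds in the garden.",
        "Somebody saw any birds in the garden.")
for s in pair:
    print(f"{sentence_logprob(s):8.2f}  {s}")
```

A higher score for the licensed sentence would mirror the human acceptability contrast; for a masked model like BERT, a pseudo-log-likelihood summed over masked positions plays the analogous role.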
Anthology ID:
2022.acl-long.455
Original:
2022.acl-long.455v1
Version 2:
2022.acl-long.455v2
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
6601–6610
URL:
https://aclanthology.org/2022.acl-long.455
DOI:
10.18653/v1/2022.acl-long.455
Cite (ACL):
Lisa Bylinina and Alexey Tikhonov. 2022. Transformers in the loop: Polarity in neural models of language. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 6601–6610, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Transformers in the loop: Polarity in neural models of language (Bylinina & Tikhonov, ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.455.pdf
Code:
altsoph/transformers-in-the-loop
Data:
Natural sentences that contain *any*
Synthetic parallel sentences that contain *any*