Contrast Sets for Stativity of English Verbs in Context

Daniel Chen, Alexis Palmer


Abstract
For the task of classifying verbs in context as dynamic or stative, current models approach human performance, but only on particular data sets. To better understand how well such models generalize beyond their test sets, we apply the contrast set methodology (Gardner et al., 2020) to stativity classification. We create nearly 300 contrastive pairs by perturbing test set instances just enough to flip their labels from one class to the other, while preserving coherence, meaning, and well-formedness. Contrastive evaluation shows that a model with near-human performance on an in-distribution test set degrades substantially on the transformed examples, indicating that the stative vs. dynamic classification task is more complex than in-distribution performance might otherwise suggest. Code and data are freely available.
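
As a concrete illustration of the contrastive evaluation the abstract describes, the sketch below computes the contrast-consistency metric of Gardner et al. (2020): the fraction of pairs for which a model labels both the original and the perturbed sentence correctly. The classifier, the helper names, and the example sentence pairs are all hypothetical stand-ins for illustration, not the paper's released model or data.

```python
# Minimal sketch of contrastive evaluation for stativity classification.
# The keyword baseline and the example pairs below are illustrative
# stand-ins; the paper's actual model and contrast set differ.

from typing import Callable, List, Tuple

# Each contrastive pair: (original sentence, original label,
#                         perturbed sentence, perturbed label)
Pair = Tuple[str, str, str, str]

PAIRS: List[Pair] = [
    # Perturbations flip the label while preserving coherence.
    ("She owns a bakery in town.",  "stative",
     "She opens a bakery in town.", "dynamic"),
    ("He ran to the station.",      "dynamic",
     "He loved the station.",       "stative"),
]

def contrast_consistency(predict: Callable[[str], str],
                         pairs: List[Pair]) -> float:
    """Fraction of pairs where BOTH members are classified correctly
    (the consistency metric of Gardner et al., 2020)."""
    hits = sum(
        predict(orig) == y_orig and predict(pert) == y_pert
        for orig, y_orig, pert, y_pert in pairs
    )
    return hits / len(pairs)

if __name__ == "__main__":
    # Trivial keyword baseline, purely for demonstration.
    STATIVE_CUES = {"owns", "loved", "knows", "believes"}
    baseline = lambda s: ("stative"
                          if any(w in s.split() for w in STATIVE_CUES)
                          else "dynamic")
    print(f"consistency = {contrast_consistency(baseline, PAIRS):.2f}")
```

Under this metric, per-example accuracy can remain high while pair consistency drops sharply, which is the kind of degradation the abstract reports for the near-human model.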
Anthology ID:
2022.coling-1.354
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Editors:
Nicoletta Calzolari, Chu-Ren Huang, Hansaem Kim, James Pustejovsky, Leo Wanner, Key-Sun Choi, Pum-Mo Ryu, Hsin-Hsi Chen, Lucia Donatelli, Heng Ji, Sadao Kurohashi, Patrizia Paggio, Nianwen Xue, Seokhwan Kim, Younggyun Hahm, Zhong He, Tony Kyungil Lee, Enrico Santus, Francis Bond, Seung-Hoon Na
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
4028–4036
URL:
https://aclanthology.org/2022.coling-1.354
Cite (ACL):
Daniel Chen and Alexis Palmer. 2022. Contrast Sets for Stativity of English Verbs in Context. In Proceedings of the 29th International Conference on Computational Linguistics, pages 4028–4036, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Contrast Sets for Stativity of English Verbs in Context (Chen & Palmer, COLING 2022)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2022.coling-1.354.pdf