VALUE: Understanding Dialect Disparity in NLU

Caleb Ziems, Jiaao Chen, Camille Harris, Jessica Anderson, Diyi Yang


Abstract
English Natural Language Understanding (NLU) systems have achieved strong performance, even outperforming humans on benchmarks like GLUE and SuperGLUE. However, these benchmarks contain only textbook Standard American English (SAE), and other dialects have been largely overlooked in the NLP community. This leads to biased and inequitable NLU systems that serve only a sub-population of speakers. To understand disparities in current models and to facilitate more dialect-competent NLU systems, we introduce the VernAcular Language Understanding Evaluation (VALUE) benchmark, a challenging variant of GLUE that we created with a set of lexical and morphosyntactic transformation rules. In this initial release (V.1), we construct rules for 11 features of African American Vernacular English (AAVE), and we recruit fluent AAVE speakers to validate each feature transformation via linguistic acceptability judgments through a participatory design process. Experiments show that these new dialectal features can lead to a drop in model performance.
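To illustrate the kind of lexical and morphosyntactic transformation rules the abstract describes, the Python sketch below applies two well-documented AAVE features, zero copula and negative concord, to SAE input. This is a hypothetical toy approximation using regular expressions, not the authors' implementation: the released salt-nlp/value rules are linguistically informed and validated by fluent speakers, and the function names and patterns here are illustrative assumptions only.

import re

def zero_copula(sentence: str) -> str:
    # Toy zero-copula rule (assumption, not the VALUE rule): drop
    # present-tense 'is'/'are' before a progressive verb,
    # e.g. "She is going home" -> "She going home".
    return re.sub(r"\b(?:is|are)\s+(\w+ing\b)", r"\1", sentence)

def negative_concord(sentence: str) -> str:
    # Toy negative-concord rule (assumption): turn negated indefinites
    # into negative forms under a negated auxiliary,
    # e.g. "He doesn't know anything" -> "He don't know nothing".
    s = re.sub(r"\bdoesn't\b", "don't", sentence)
    return re.sub(r"\bany(thing|one|body|where)\b", r"no\1", s)

print(zero_copula("She is going home"))             # She going home
print(negative_concord("He doesn't know anything")) # He don't know nothing

A perturbed benchmark like VALUE applies such rules to every GLUE example, so a model's score drop under transformation isolates its sensitivity to the dialectal features themselves rather than to task difficulty.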
Anthology ID:
2022.acl-long.258
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
3701–3720
URL:
https://aclanthology.org/2022.acl-long.258
DOI:
10.18653/v1/2022.acl-long.258
Cite (ACL):
Caleb Ziems, Jiaao Chen, Camille Harris, Jessica Anderson, and Diyi Yang. 2022. VALUE: Understanding Dialect Disparity in NLU. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 3701–3720, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
VALUE: Understanding Dialect Disparity in NLU (Ziems et al., ACL 2022)
PDF:
https://preview.aclanthology.org/auto-file-uploads/2022.acl-long.258.pdf
Video:
https://preview.aclanthology.org/auto-file-uploads/2022.acl-long.258.mp4
Code:
salt-nlp/value
Data:
CoLA, GLUE, QNLI