Polyjuice: Generating Counterfactuals for Explaining, Evaluating, and Improving Models

Tongshuang Wu, Marco Tulio Ribeiro, Jeffrey Heer, Daniel Weld


Abstract
While counterfactual examples are useful for analysis and training of NLP models, current generation methods either rely on manual labor to create very few counterfactuals, or only instantiate limited types of perturbations such as paraphrases or word substitutions. We present Polyjuice, a general-purpose counterfactual generator that allows for control over perturbation types and locations, trained by finetuning GPT-2 on multiple datasets of paired sentences. We show that Polyjuice produces diverse sets of realistic counterfactuals, which in turn are useful in various distinct applications: improving training and evaluation on three different tasks (with around 70% less annotation effort than manual generation), augmenting state-of-the-art explanation techniques, and supporting systematic counterfactual error analysis by revealing behaviors easily missed by human experts.
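The abstract describes a generator with control over perturbation types and locations. As a rough illustration, below is a minimal usage sketch assuming the wrapper API documented in the tongshuangwu/polyjuice repository README (PyPI package `polyjuice_nlp`); the `Polyjuice` class, the "uw-hai/polyjuice" model path, and keyword arguments such as `ctrl_code` and `blanked_sent` are taken from that README and are assumptions here, not details stated on this page.

```python
# Minimal sketch of generating counterfactuals with the released
# Polyjuice wrapper. All names below (Polyjuice, perturb, ctrl_code,
# blanked_sent, "uw-hai/polyjuice") are assumptions drawn from the
# repository README and may differ from the current release.
from polyjuice import Polyjuice

# Load the finetuned GPT-2 generator; set is_cuda=True on a GPU machine.
pj = Polyjuice(model_path="uw-hai/polyjuice", is_cuda=False)

text = "It is great for kids."

# Unconstrained: Polyjuice picks both the perturbation type and location.
perturbations = pj.perturb(text)

# Controlled: fix the perturbation type with a control code and the
# location with a [BLANK] token that marks where the model may rewrite.
negations = pj.perturb(
    orig_sent=text,
    ctrl_code="negation",
    blanked_sent="It is [BLANK] great for kids.",
)
print(negations)  # e.g. ["It is not great for kids."]
```

In the paper, control codes such as negation, quantifier, shuffle, lexical, resemantic, insert, delete, and restructure select the perturbation type, while [BLANK] tokens constrain its location.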
Anthology ID: 2021.acl-long.523
Volume: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month: August
Year: 2021
Address: Online
Editors: Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
Venues: ACL | IJCNLP
Publisher: Association for Computational Linguistics
Pages: 6707–6723
URL: https://aclanthology.org/2021.acl-long.523
DOI: 10.18653/v1/2021.acl-long.523
Cite (ACL): Tongshuang Wu, Marco Tulio Ribeiro, Jeffrey Heer, and Daniel Weld. 2021. Polyjuice: Generating Counterfactuals for Explaining, Evaluating, and Improving Models. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 6707–6723, Online. Association for Computational Linguistics.
Cite (Informal): Polyjuice: Generating Counterfactuals for Explaining, Evaluating, and Improving Models (Wu et al., ACL-IJCNLP 2021)
PDF: https://preview.aclanthology.org/improve-issue-templates/2021.acl-long.523.pdf
Video: https://preview.aclanthology.org/improve-issue-templates/2021.acl-long.523.mp4
Code: tongshuangwu/polyjuice
Data: GLUE | IMDb Movie Reviews | SNLI | SST | SST-2