PerturbScore: Connecting Discrete and Continuous Perturbations in NLP

Linyang Li, Ke Ren, Yunfan Shao, Pengyu Wang, Xipeng Qiu


Abstract
With the rapid development of neural network applications in NLP, the problem of model robustness is gaining more attention. Unlike in computer vision, the discrete nature of text makes robustness in NLP more challenging to explore. In this paper, we therefore aim to connect discrete perturbations with continuous perturbations, using this connection as a bridge to help understand discrete perturbations in NLP models. Specifically, we first explore how to connect and measure the correlation between discrete and continuous perturbations. We then design a regression task, PerturbScore, to learn this correlation automatically. Experimental results show that we can build a connection between discrete and continuous perturbations and use the proposed PerturbScore to learn this correlation, surpassing previous methods for measuring discrete perturbations. Moreover, PerturbScore generalizes well across different datasets and perturbation methods, indicating that it can serve as a powerful tool for studying model robustness in NLP.
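To make the abstract's core idea concrete, below is a minimal, self-contained sketch of relating a discrete perturbation (word substitutions) to a continuous perturbation (an L2 distance in embedding space) and fitting a regressor between the two. This is an illustrative toy, not the paper's actual method or data: the vocabulary, random embeddings, mean-pooling, and the one-dimensional least-squares fit are all simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy vocabulary with random embeddings -- a stand-in for a
# real NLP model's embedding layer (illustrative, not from the paper).
vocab = ["the", "cat", "sat", "mat", "dog", "ran", "rug", "on"]
emb = {w: rng.normal(size=8) for w in vocab}

def sent_embedding(tokens):
    """Mean-pool token embeddings, a stand-in for a model's representation."""
    return np.mean([emb[t] for t in tokens], axis=0)

def continuous_norm(orig, pert):
    """Continuous-perturbation magnitude: L2 distance in embedding space."""
    return float(np.linalg.norm(sent_embedding(orig) - sent_embedding(pert)))

def discrete_edits(orig, pert):
    """Discrete-perturbation size: number of substituted token positions."""
    return sum(a != b for a, b in zip(orig, pert))

# Build a toy dataset of (discrete edit count, continuous norm) pairs by
# randomly substituting tokens, then fit a 1-D least-squares regressor --
# a drastically simplified analogue of the PerturbScore regression task.
base = ["the", "cat", "sat", "on", "the", "mat"]
xs, ys = [], []
for _ in range(200):
    pert = list(base)
    k = int(rng.integers(0, len(base) + 1))
    for i in rng.choice(len(base), size=k, replace=False):
        pert[i] = vocab[rng.integers(len(vocab))]
    xs.append(discrete_edits(base, pert))
    ys.append(continuous_norm(base, pert))

slope, intercept = np.polyfit(xs, ys, 1)
print(f"fitted: norm ~= {slope:.3f} * edits + {intercept:.3f}")
```

With a fitted positive slope, larger discrete edits map, on average, to larger continuous perturbations; the paper's contribution is learning such a mapping with a trained regressor rather than this toy linear fit.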
Anthology ID:
2023.findings-emnlp.442
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
6638–6648
URL:
https://aclanthology.org/2023.findings-emnlp.442
DOI:
10.18653/v1/2023.findings-emnlp.442
Cite (ACL):
Linyang Li, Ke Ren, Yunfan Shao, Pengyu Wang, and Xipeng Qiu. 2023. PerturbScore: Connecting Discrete and Continuous Perturbations in NLP. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 6638–6648, Singapore. Association for Computational Linguistics.
Cite (Informal):
PerturbScore: Connecting Discrete and Continuous Perturbations in NLP (Li et al., Findings 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-5/2023.findings-emnlp.442.pdf