Decorrelate Irrelevant, Purify Relevant: Overcome Textual Spurious Correlations from a Feature Perspective

Shihan Dou, Rui Zheng, Ting Wu, SongYang Gao, Junjie Shan, Qi Zhang, Yueming Wu, Xuanjing Huang


Abstract
Natural language understanding (NLU) models tend to rely on spurious correlations (i.e., dataset bias), achieving high performance on in-distribution datasets but performing poorly on out-of-distribution ones. Most existing debiasing methods identify and down-weight samples that carry biased features (i.e., superficial surface features that cause such spurious correlations). However, down-weighting these samples prevents the model from learning from their non-biased parts. To tackle this challenge, we propose to eliminate spurious correlations in a fine-grained manner from a feature-space perspective. Specifically, we introduce Random Fourier Features and weighted re-sampling to decorrelate dependencies between features and thereby mitigate spurious correlations. After obtaining decorrelated features, we further design a mutual-information-based method to purify them, forcing the model to learn features that are more relevant to the task. Extensive experiments on two well-studied NLU tasks demonstrate that our method outperforms comparable approaches.
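The abstract compresses a two-step method: (1) decorrelate feature dimensions by learning per-sample weights against a Random-Fourier-Feature (RFF) dependence measure, and (2) purify the decorrelated features with a mutual-information criterion. Below is a minimal PyTorch sketch of the first step; all names (`random_fourier_features`, `decorrelation_loss`, `training_step`) and hyperparameters are hypothetical illustrations rather than the authors' released code (see the linked coling2022-depro/depro repository for that), and the MI-based purification step is only hinted at in a comment.

```python
import torch

def random_fourier_features(x, num_rff=32, scale=1.0):
    """Map each scalar feature column through random Fourier features, so
    nonlinear dependence between features shows up as linear correlation
    between the mapped coordinates (approximating an RBF kernel)."""
    # x: (n, d). Frequencies w and phases b are resampled per call, which
    # makes the dependence estimate stochastic across training steps.
    w = torch.randn(num_rff, device=x.device) * scale
    b = 2 * torch.pi * torch.rand(num_rff, device=x.device)
    return (2.0 / num_rff) ** 0.5 * torch.cos(x.unsqueeze(-1) * w + b)  # (n, d, num_rff)

def decorrelation_loss(feats, logits_w):
    """Sum of squared weighted cross-covariances between the RFF maps of
    every pair of feature dimensions; zero means the features are roughly
    pairwise independent under the learned sample re-weighting."""
    n, d = feats.shape
    w = torch.softmax(logits_w, dim=0)            # sample weights, sum to 1
    phi = random_fourier_features(feats)          # (n, d, k)
    mu = (w[:, None, None] * phi).sum(dim=0)      # weighted mean, (d, k)
    c = phi - mu                                  # centered RFF maps
    loss = feats.new_zeros(())
    for i in range(d):
        for j in range(i + 1, d):
            # weighted cross-covariance matrix between feature dims i and j
            cov = (w[:, None, None] * c[:, i, :, None] * c[:, j, None, :]).sum(0)
            loss = loss + (cov ** 2).sum()
    return loss

def training_step(encoder, head, batch_x, batch_y, weight_steps=5):
    """Hypothetical training step: fit decorrelating sample weights on frozen
    features, then use them to re-weight the task loss."""
    feats = encoder(batch_x)                      # (n, d)
    logits_w = torch.zeros(feats.size(0), requires_grad=True)
    opt_w = torch.optim.SGD([logits_w], lr=1.0)
    for _ in range(weight_steps):                 # learn re-sampling weights
        opt_w.zero_grad()
        decorrelation_loss(feats.detach(), logits_w).backward()
        opt_w.step()
    w = torch.softmax(logits_w.detach(), dim=0)
    # Weighted cross-entropy on the decorrelated features. One plausible
    # reading of the paper's MI-based "purification" is to additionally keep
    # only feature components that retain high I(feature; label); that step
    # is not sketched here.
    ce = torch.nn.functional.cross_entropy(head(feats), batch_y, reduction="none")
    return (w * ce).sum()
```

The alternating scheme (fit weights, then re-weight the task loss) mirrors common RFF-based sample re-weighting approaches; the paper's actual objective and optimization schedule may differ in detail.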
Anthology ID: 2022.coling-1.199
Volume: Proceedings of the 29th International Conference on Computational Linguistics
Month: October
Year: 2022
Address: Gyeongju, Republic of Korea
Venue: COLING
Publisher: International Committee on Computational Linguistics
Pages: 2278–2287
URL: https://aclanthology.org/2022.coling-1.199
Cite (ACL):
Shihan Dou, Rui Zheng, Ting Wu, SongYang Gao, Junjie Shan, Qi Zhang, Yueming Wu, and Xuanjing Huang. 2022. Decorrelate Irrelevant, Purify Relevant: Overcome Textual Spurious Correlations from a Feature Perspective. In Proceedings of the 29th International Conference on Computational Linguistics, pages 2278–2287, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Decorrelate Irrelevant, Purify Relevant: Overcome Textual Spurious Correlations from a Feature Perspective (Dou et al., COLING 2022)
PDF: https://preview.aclanthology.org/ingestion-script-update/2022.coling-1.199.pdf
Code: coling2022-depro/depro (additional community code available)
Data: MultiNLI