Abstract
Dataset bias has attracted increasing attention recently for its detrimental effect on the generalization ability of fine-tuned models. The current mainstream solution is to design an additional shallow model that pre-identifies biased instances. However, such two-stage methods increase the computational cost of training and suppress valid feature information while mitigating bias. To address this issue, we utilize representation normalization, which disentangles the correlations between the features of encoded sentences. We find it is also promising for eliminating bias, as it yields an isotropic data distribution. We further propose Kernel-Whitening, a Nyström kernel approximation method that achieves more thorough debiasing of nonlinear spurious correlations. Our framework is end-to-end, with time consumption similar to fine-tuning. Experiments show that Kernel-Whitening significantly improves the performance of BERT on out-of-distribution datasets while maintaining in-distribution accuracy.
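As a rough illustration of the two ideas the abstract names, here is a minimal NumPy sketch (not the authors' released implementation): ZCA-whitening an embedding matrix so its features are decorrelated and isotropic, and a Nyström approximation of an RBF kernel feature map so that the same whitening can also remove nonlinear correlations. The function names, the RBF kernel choice, and all parameter values are illustrative assumptions.

```python
import numpy as np

def whiten(X, eps=1e-6):
    """ZCA-whiten embeddings X of shape (n, d): zero mean and
    (approximately) identity covariance, i.e. an isotropic distribution."""
    Xc = X - X.mean(axis=0, keepdims=True)
    cov = Xc.T @ Xc / len(Xc)
    U, s, _ = np.linalg.svd(cov)                      # cov is symmetric PSD
    W = U @ np.diag(1.0 / np.sqrt(s + eps)) @ U.T     # cov^(-1/2)
    return Xc @ W

def nystrom_features(X, landmarks, gamma, eps=1e-6):
    """Nystrom feature map for an RBF kernel:
    phi(X) = K(X, L) @ K(L, L)^(-1/2) for a landmark set L,
    so that phi(X) @ phi(Y).T approximates K(X, Y)."""
    def rbf(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)
    U, s, _ = np.linalg.svd(rbf(landmarks, landmarks))
    K_mm_inv_sqrt = U @ np.diag(1.0 / np.sqrt(s + eps)) @ U.T
    return rbf(X, landmarks) @ K_mm_inv_sqrt

# Toy usage: 256 "sentence embeddings" with correlated dimensions.
rng = np.random.default_rng(0)
X = rng.normal(size=(256, 32)) @ rng.normal(size=(32, 32))
landmarks = X[rng.choice(len(X), size=64, replace=False)]
gamma = 1.0 / np.median(((X[:, None] - landmarks[None]) ** 2).sum(-1))
Z = whiten(nystrom_features(X, landmarks, gamma))
print(np.round(np.cov(Z, rowvar=False)[:4, :4], 2))   # ~ identity block
```

In the paper itself, the transform is applied to sentence representations during fine-tuning; this sketch only demonstrates the linear-algebra core on random data.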
- Anthology ID: 2022.emnlp-main.275
- Volume: Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
- Month: December
- Year: 2022
- Address: Abu Dhabi, United Arab Emirates
- Editors: Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
- Venue: EMNLP
- Publisher: Association for Computational Linguistics
- Pages: 4112–4122
- URL: https://aclanthology.org/2022.emnlp-main.275
- DOI: 10.18653/v1/2022.emnlp-main.275
- Cite (ACL): SongYang Gao, Shihan Dou, Qi Zhang, and Xuanjing Huang. 2022. Kernel-Whitening: Overcome Dataset Bias with Isotropic Sentence Embedding. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 4112–4122, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
- Cite (Informal): Kernel-Whitening: Overcome Dataset Bias with Isotropic Sentence Embedding (Gao et al., EMNLP 2022)
- PDF: https://preview.aclanthology.org/dois-2013-emnlp/2022.emnlp-main.275.pdf