BARLE: Background-Aware Representation Learning for Background Shift Out-of-Distribution Detection

Hanyu Duan, Yi Yang, Ahmed Abbasi, Kar Yan Tam


Abstract
Machine learning models often suffer a performance drop when applied to out-of-distribution (OOD) samples, i.e., those drawn far from the training data distribution. Existing OOD detection work mostly focuses on identifying semantic-shift OOD samples, e.g., instances from unseen new classes. However, background-shift OOD detection, which identifies samples with a domain or style change, is a more practical yet challenging task. In this paper, we propose Background-Aware Representation Learning (BARLE) for background-shift OOD detection in NLP. Specifically, we generate semantics-preserving, background-shifted pseudo-OOD samples from pretrained masked language models. We then contrast the in-distribution (ID) samples with their pseudo-OOD counterparts. Unlike prior semantic-shift OOD detection work that often leverages an external text corpus, BARLE uses only ID data, making it more flexible and cost-efficient. In experiments across several text classification tasks, we demonstrate that BARLE improves background-shift OOD detection performance while maintaining ID classification accuracy. We further investigate the properties of the generated pseudo-OOD samples, uncovering the working mechanism of BARLE.
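The contrastive step described in the abstract (pulling ID samples together while pushing each ID sample away from its background-shifted pseudo-OOD counterpart) can be illustrated with a minimal InfoNCE-style sketch. This is a hypothetical simplification, not the paper's actual loss: the function names, the single-negative setup, and the use of cosine similarity are assumptions for illustration only.

```python
import math

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def id_vs_pseudo_ood_loss(anchor, positive, pseudo_ood, temperature=0.1):
    """Hypothetical InfoNCE-style objective: pull the ID anchor toward
    another ID sample (positive) and push it away from its
    background-shifted pseudo-OOD counterpart (negative).
    A lower loss means the anchor is closer to the positive than to
    the pseudo-OOD negative in embedding space."""
    pos = math.exp(cosine(anchor, positive) / temperature)
    neg = math.exp(cosine(anchor, pseudo_ood) / temperature)
    return -math.log(pos / (pos + neg))

# Toy 2-d embeddings: the loss is smaller when the pseudo-OOD sample
# is already far from the ID anchor, larger when it sits nearby.
anchor = [1.0, 0.0]
positive = [0.9, 0.1]
loss_far = id_vs_pseudo_ood_loss(anchor, positive, [0.0, 1.0])
loss_near = id_vs_pseudo_ood_loss(anchor, positive, [0.95, 0.05])
```

In this toy setup, `loss_far < loss_near`: minimizing such an objective separates ID representations from their background-shifted counterparts, which is the intuition the abstract describes.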
Anthology ID:
2022.findings-emnlp.53
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
750–764
URL:
https://aclanthology.org/2022.findings-emnlp.53
DOI:
10.18653/v1/2022.findings-emnlp.53
Cite (ACL):
Hanyu Duan, Yi Yang, Ahmed Abbasi, and Kar Yan Tam. 2022. BARLE: Background-Aware Representation Learning for Background Shift Out-of-Distribution Detection. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 750–764, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
BARLE: Background-Aware Representation Learning for Background Shift Out-of-Distribution Detection (Duan et al., Findings 2022)
PDF:
https://preview.aclanthology.org/proper-vol2-ingestion/2022.findings-emnlp.53.pdf