Learning to Detect Noisy Labels Using Model-Based Features

Zhihao Wang, Zongyu Lin, Junjie Wen, Xianxin Chen, Peiqi Liu, Guidong Zheng, Yujun Chen, Zhilin Yang


Abstract
Label noise is ubiquitous in machine learning scenarios such as self-labeling with model predictions and erroneous data annotation. Many existing approaches rely on heuristics such as sample losses, which may not be flexible enough to achieve optimal solutions. Meta-learning-based methods address this issue by learning a data selection function, but they can be hard to optimize. In light of these pros and cons, we propose SENT (Selection-Enhanced Noisy label Training), which does not rely on meta-learning while retaining the flexibility of being data-driven. SENT transfers the noise distribution to a clean set and trains a model to distinguish noisy labels from clean ones using model-based features. Empirically, on a wide range of tasks including text classification and speech recognition, SENT improves performance over strong baselines under the settings of self-training and label corruption.
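The core idea in the abstract can be sketched in a few lines: inject a known noise distribution into a probe set, extract model-based features for each example (e.g., per-sample loss and prediction confidence), and train a binary detector that separates injected-noisy from clean examples, which is then applied to the target noisy data. The sketch below is a minimal illustration, not the paper's implementation: the features are simulated rather than produced by a trained model, and the detector is a plain logistic regression fit by gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_features(n, noisy):
    """Hypothetical model-based features for illustration only.

    In SENT these would come from a model trained on the data (e.g.,
    per-sample loss, prediction confidence). Here we simulate the typical
    pattern: noisy examples have higher loss and lower confidence.
    """
    loss = rng.normal(2.5 if noisy else 0.5, 0.5, n)
    conf = rng.normal(0.4 if noisy else 0.9, 0.1, n)
    return np.column_stack([loss, conf])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))

# Probe set with *known* injected noise (the transferred noise
# distribution): y = 1 marks injected-noisy examples, y = 0 clean ones.
X_probe = np.vstack([simulate_features(500, False), simulate_features(500, True)])
y_probe = np.concatenate([np.zeros(500), np.ones(500)])

# Train a logistic-regression noise detector by gradient descent
# (bias term appended as a constant feature).
Xb = np.column_stack([X_probe, np.ones(len(X_probe))])
w = np.zeros(Xb.shape[1])
for _ in range(2000):
    p = sigmoid(Xb @ w)
    w -= 0.1 * Xb.T @ (p - y_probe) / len(y_probe)

# Apply the detector to a target set: keep examples predicted clean.
X_target = np.vstack([simulate_features(200, False), simulate_features(200, True)])
Xt = np.column_stack([X_target, np.ones(len(X_target))])
keep = sigmoid(Xt @ w) < 0.5
print(f"kept {keep.sum()} / {len(keep)} examples as clean")
```

Because the simulated clean and noisy feature distributions are well separated, the detector keeps most of the truly clean examples and filters out most of the noisy ones; with real model-based features the separation is weaker, which is why learning the detector (rather than thresholding a single heuristic) matters.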
Anthology ID: 2022.findings-emnlp.426
Volume: Findings of the Association for Computational Linguistics: EMNLP 2022
Month: December
Year: 2022
Address: Abu Dhabi, United Arab Emirates
Editors: Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 5796–5808
URL: https://aclanthology.org/2022.findings-emnlp.426
DOI: 10.18653/v1/2022.findings-emnlp.426
Cite (ACL): Zhihao Wang, Zongyu Lin, Junjie Wen, Xianxin Chen, Peiqi Liu, Guidong Zheng, Yujun Chen, and Zhilin Yang. 2022. Learning to Detect Noisy Labels Using Model-Based Features. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 5796–5808, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal): Learning to Detect Noisy Labels Using Model-Based Features (Wang et al., Findings 2022)
PDF: https://preview.aclanthology.org/nschneid-patch-2/2022.findings-emnlp.426.pdf