Calibrating Pseudo-Labeling with Class Distribution for Semi-supervised Text Classification

Weiyi Yang, Richong Zhang, Junfan Chen, Jiawei Sheng


Abstract
Semi-supervised text classification (SSTC) aims to train text classification models with limited labeled data and massive unlabeled data. Existing studies develop effective pseudo-labeling methods, but they can struggle when the unlabeled data have an imbalanced class distribution that mismatches the labeled data: pseudo-labeling then becomes biased towards the majority classes, leading to catastrophic error propagation. We believe it is crucial to explicitly estimate the overall class distribution and use it to calibrate pseudo-labeling so as to constrain the majority classes. To this end, we formulate pseudo-labeling as an optimal transport (OT) problem that transports the unlabeled sample distribution to the class distribution. With a memory bank, we dynamically collect both high-confidence pseudo-labeled data and true labeled data, deriving reliable (pseudo-)labels for class distribution estimation. Empirical results on three commonly used benchmarks demonstrate that our model is effective and outperforms previous state-of-the-art methods.
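The OT formulation described above can be illustrated with a minimal sketch (not the authors' implementation, which is not shown here): entropy-regularized optimal transport solved with Sinkhorn iterations reweights the model's softmax predictions so that the aggregate pseudo-label marginal matches an estimated class distribution. The function name, the regularization strength `eps`, and the iteration count are illustrative assumptions.

```python
import numpy as np

def sinkhorn_pseudo_labels(probs, class_dist, eps=0.1, n_iters=200):
    """Calibrate predictions with an estimated class distribution via
    entropy-regularized optimal transport (Sinkhorn iterations).

    probs:      (N, C) model softmax outputs for unlabeled samples.
    class_dist: (C,) estimated class distribution (sums to 1).
    Returns an (N, C) soft pseudo-label matrix whose column sums
    match N * class_dist, constraining the majority classes.
    """
    n, _ = probs.shape
    cost = -np.log(probs + 1e-12)        # cost of assigning sample i to class c
    K = np.exp(-cost / eps)              # Gibbs kernel
    r = np.full(n, 1.0 / n)              # uniform marginal over samples
    c = np.asarray(class_dist, float)    # target marginal over classes
    u = np.ones(n)
    for _ in range(n_iters):
        v = c / (K.T @ u)                # scale columns toward class_dist
        u = r / (K @ v)                  # scale rows toward uniform
    v = c / (K.T @ u)
    Q = u[:, None] * K * v[None, :]      # transport plan: columns sum to c
    return n * Q                         # rows ~sum to 1 -> soft pseudo-labels
```

With a balanced target distribution, confidence-based argmax labels that would all fall into one majority class get redistributed: the least confident majority-class samples are pushed toward the minority classes until the marginal constraint is met.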
Anthology ID:
2025.emnlp-main.658
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
13026–13039
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.658/
Cite (ACL):
Weiyi Yang, Richong Zhang, Junfan Chen, and Jiawei Sheng. 2025. Calibrating Pseudo-Labeling with Class Distribution for Semi-supervised Text Classification. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 13026–13039, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Calibrating Pseudo-Labeling with Class Distribution for Semi-supervised Text Classification (Yang et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.658.pdf
Checklist:
2025.emnlp-main.658.checklist.pdf