PESCO: Prompt-enhanced Self Contrastive Learning for Zero-shot Text Classification

Yau-Shian Wang, Ta-Chung Chi, Ruohong Zhang, Yiming Yang
Abstract
We present PESCO, a novel contrastive learning framework that substantially improves the performance of zero-shot text classification. We formulate text classification as a neural text retrieval problem where each document is treated as a query, and the system learns the mapping from each query to the relevant class labels by (1) adding prompts to enhance label retrieval, and (2) using retrieved labels to enrich the training set in a self-training loop of contrastive learning. PESCO achieves state-of-the-art performance on four benchmark text classification datasets. On DBpedia, we achieve 98.5% accuracy without any labeled data, which is close to the fully-supervised result. Extensive experiments and analyses show all the components of PESCO are necessary for improving the performance of zero-shot text classification.
Anthology ID:
2023.acl-long.832
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
14897–14911
URL:
https://aclanthology.org/2023.acl-long.832
DOI:
10.18653/v1/2023.acl-long.832
Cite (ACL):
Yau-Shian Wang, Ta-Chung Chi, Ruohong Zhang, and Yiming Yang. 2023. PESCO: Prompt-enhanced Self Contrastive Learning for Zero-shot Text Classification. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 14897–14911, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
PESCO: Prompt-enhanced Self Contrastive Learning for Zero-shot Text Classification (Wang et al., ACL 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-2/2023.acl-long.832.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-2/2023.acl-long.832.mp4