CLIPText: A New Paradigm for Zero-shot Text Classification

Libo Qin, Weiyun Wang, Qiguang Chen, Wanxiang Che


Abstract
While CLIP models are widely used for zero-shot vision-and-language (VL) tasks and computer vision tasks, little attention has been paid to applying CLIP to language tasks. Intuitively, CLIP models carry rich representations pre-trained with natural language supervision, which we argue can also benefit language tasks. This work bridges the gap by investigating CLIP for zero-shot text classification. Specifically, we introduce CLIPText, a novel paradigm for zero-shot text classification that reformulates the task as a text-image matching problem to which CLIP can be applied. In addition, we incorporate prompts into CLIPText (Prompt-CLIPText) to better elicit knowledge from CLIP. Experimental results on seven publicly available zero-shot text classification datasets show that both CLIPText and Prompt-CLIPText attain promising performance. Moreover, extensive analysis verifies that knowledge from CLIP benefits the zero-shot text classification task. We hope this work inspires further research on applying VL pre-trained models to language tasks.
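For intuition, below is a minimal sketch of the text-image matching reformulation the abstract describes, assuming (as an illustrative choice, not the paper's exact recipe) that the input text is rendered onto an image canvas and matched against prompted label names with an off-the-shelf CLIP checkpoint from HuggingFace Transformers. The checkpoint name, rendering procedure, and prompt template here are all assumptions for illustration.

# A hedged sketch of CLIPText-style zero-shot classification: the input
# sentence is rendered as an image (hypothetical rendering), label names are
# verbalized with a prompt template (hypothetical template), and CLIP scores
# the image against each label prompt.
from PIL import Image, ImageDraw
import torch
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

def render_text(text: str, size=(224, 224)) -> Image.Image:
    # Draw the input sentence on a blank canvas so it can be fed to
    # CLIP's image encoder.
    img = Image.new("RGB", size, "white")
    ImageDraw.Draw(img).multiline_text((8, 8), text, fill="black")
    return img

def classify(text: str, labels: list[str]) -> str:
    # Prompt-CLIPText uses prompts to verbalize labels; this particular
    # template is an assumption.
    prompts = [f"a photo of {label}." for label in labels]
    inputs = processor(text=prompts, images=render_text(text),
                       return_tensors="pt", padding=True)
    with torch.no_grad():
        # logits_per_image: similarity of the rendered text to each prompt.
        logits = model(**inputs).logits_per_image
    return labels[logits.argmax(dim=-1).item()]

print(classify("The striker scored twice in the final.",
               ["sports", "politics", "business", "technology"]))

Under this reading, the document flows through the frozen image encoder while the label names flow through the text encoder, mirroring CLIP's standard zero-shot image classification setup.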
Anthology ID:
2023.findings-acl.69
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1077–1088
URL:
https://aclanthology.org/2023.findings-acl.69
DOI:
10.18653/v1/2023.findings-acl.69
Cite (ACL):
Libo Qin, Weiyun Wang, Qiguang Chen, and Wanxiang Che. 2023. CLIPText: A New Paradigm for Zero-shot Text Classification. In Findings of the Association for Computational Linguistics: ACL 2023, pages 1077–1088, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
CLIPText: A New Paradigm for Zero-shot Text Classification (Qin et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.69.pdf