Effectiveness of Pre-training for Few-shot Intent Classification
Haode Zhang, Yuwei Zhang, Li-Ming Zhan, Jiaxin Chen, Guangyuan Shi, Xiao-Ming Wu, Albert Y.S. Lam
Abstract
This paper investigates the effectiveness of pre-training for few-shot intent classification. While existing paradigms commonly further pre-train language models such as BERT on vast amounts of unlabeled corpora, we find it highly effective and efficient to simply fine-tune BERT with a small set of labeled utterances from public datasets. Specifically, fine-tuning BERT with roughly 1,000 labeled examples yields a pre-trained model, IntentBERT, which easily surpasses existing pre-trained models for few-shot intent classification on novel domains with very different semantics. The high effectiveness of IntentBERT confirms the feasibility and practicality of few-shot intent detection, and its strong generalization across domains suggests that intent classification tasks may share a common underlying structure that can be learned efficiently from a small set of labeled data. The source code can be found at https://github.com/hdzhang-code/IntentBERT.
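To make the paradigm concrete, the sketch below fine-tunes BERT for intent classification on a small labeled set using the Hugging Face transformers library. It is a minimal illustration, not the authors' exact pipeline: the toy `utterances`/`labels` data, the hyperparameters, and the `intentbert-sketch` output path are all illustrative assumptions; in the paper the supervised set comes from public intent datasets and contains roughly 1,000 examples.

```python
# Minimal sketch of IntentBERT-style supervised pre-training (assumptions noted above).
import torch
from torch.utils.data import DataLoader, TensorDataset
from transformers import BertForSequenceClassification, BertTokenizer

# Placeholder labeled utterances; in practice, load ~1,000 examples from a
# public intent dataset such as HWU64.
utterances = ["book a table for two tonight", "what's the weather tomorrow"]
labels = [0, 1]  # integer intent ids
num_intents = 2

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=num_intents
)

# Tokenize and wrap in a standard PyTorch dataset/loader.
enc = tokenizer(utterances, padding=True, truncation=True, return_tensors="pt")
dataset = TensorDataset(enc["input_ids"], enc["attention_mask"], torch.tensor(labels))
loader = DataLoader(dataset, batch_size=16, shuffle=True)

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for epoch in range(3):  # a few epochs suffice for ~1,000 examples (assumed)
    for input_ids, attention_mask, y in loader:
        optimizer.zero_grad()
        out = model(input_ids=input_ids, attention_mask=attention_mask, labels=y)
        out.loss.backward()
        optimizer.step()

# The fine-tuned encoder can then be reused as a fixed utterance encoder for
# few-shot intent classification on novel domains.
model.bert.save_pretrained("intentbert-sketch")
```

The key design point is that the cross-domain transfer comes from the fine-tuned encoder itself: after this supervised step, the classification head is discarded and the encoder's utterance embeddings serve downstream few-shot tasks on unseen domains.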
- Anthology ID:
- 2021.findings-emnlp.96
- Volume:
- Findings of the Association for Computational Linguistics: EMNLP 2021
- Month:
- November
- Year:
- 2021
- Address:
- Punta Cana, Dominican Republic
- Editors:
- Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
- Venue:
- Findings
- SIG:
- SIGDAT
- Publisher:
- Association for Computational Linguistics
- Pages:
- 1114–1120
- URL:
- https://aclanthology.org/2021.findings-emnlp.96
- DOI:
- 10.18653/v1/2021.findings-emnlp.96
- Cite (ACL):
- Haode Zhang, Yuwei Zhang, Li-Ming Zhan, Jiaxin Chen, Guangyuan Shi, Xiao-Ming Wu, and Albert Y.S. Lam. 2021. Effectiveness of Pre-training for Few-shot Intent Classification. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 1114–1120, Punta Cana, Dominican Republic. Association for Computational Linguistics.
- Cite (Informal):
- Effectiveness of Pre-training for Few-shot Intent Classification (Zhang et al., Findings 2021)
- PDF:
- https://preview.aclanthology.org/emnlp-22-attachments/2021.findings-emnlp.96.pdf
- Data
- BANKING77, HINT3, HWU64