Leveraging Text-to-Text Transformers as Classifier Chain for Few-Shot Multi-Label Classification

Quang Anh Nguyen, Nadi Tomeh, Mustapha Lebbah, Thierry Charnois, Hanane Azzag


Abstract
Multi-label text classification (MLTC) is an essential task in NLP applications. Traditional methods require extensive labeled data and are limited to fixed label sets. Extracting labels with LLMs is more effective and universal but incurs high computational costs. In this work, we introduce a distillation-based T5 generalist model for zero-shot MLTC and few-shot fine-tuning. Our model accommodates variable label sets via general, domain-agnostic pretraining, while modeling dependencies between labels. Experiments show that our approach outperforms baselines of similar size on three few-shot tasks. Our code is available at https://anonymous.4open.science/r/t5-multilabel-0C32/README.md
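The classifier-chain idea behind the paper is to frame multi-label classification as text-to-text generation, so that each label decision can condition on the labels already assigned. The sketch below illustrates that idea with an off-the-shelf Hugging Face T5 checkpoint; the prompt format, the per-label yes/no querying scheme, and the use of plain t5-base are illustrative assumptions, not the paper's distilled model or training recipe.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Off-the-shelf checkpoint used only for illustration; the paper's model
# is a distilled T5 pretrained for this task, which is not released here.
tokenizer = AutoTokenizer.from_pretrained("t5-base")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-base")

def predict_labels(text, candidate_labels):
    """Classifier-chain-style decoding: query each candidate label in turn,
    conditioning the prompt on labels already predicted (hypothetical format)."""
    predicted = []
    for label in candidate_labels:
        prompt = (
            f"multilabel classification. text: {text} "
            f"already assigned: {', '.join(predicted) if predicted else 'none'}. "
            f"does the label '{label}' apply? answer yes or no:"
        )
        inputs = tokenizer(prompt, return_tensors="pt", truncation=True)
        output_ids = model.generate(**inputs, max_new_tokens=3)
        answer = tokenizer.decode(output_ids[0], skip_special_tokens=True)
        if answer.strip().lower().startswith("yes"):
            predicted.append(label)
    return predicted

# The candidate label set is supplied at inference time, so the same model
# can serve any label inventory without retraining.
print(predict_labels(
    "The central bank raised interest rates amid rising inflation.",
    ["economics", "politics", "sports"],
))
```

Because labels are verbalized in the prompt rather than fixed as output dimensions, the label set can vary per dataset, which is what makes zero-shot transfer and few-shot fine-tuning on new inventories possible in this framing.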
Anthology ID:
2025.emnlp-main.1368
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
26929–26938
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1368/
Cite (ACL):
Quang Anh Nguyen, Nadi Tomeh, Mustapha Lebbah, Thierry Charnois, and Hanane Azzag. 2025. Leveraging Text-to-Text Transformers as Classifier Chain for Few-Shot Multi-Label Classification. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 26929–26938, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Leveraging Text-to-Text Transformers as Classifier Chain for Few-Shot Multi-Label Classification (Nguyen et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1368.pdf
Checklist:
 2025.emnlp-main.1368.checklist.pdf