Shengyu Qiao
2026
Supervised Contrastive Fine-Tuning for Active Few-Shot Learning
Zirui Zhang | Lei Ge | Shengyu Qiao
Proceedings of the Fifteenth Language Resources and Evaluation Conference
Active Few-Shot Learning (AFSL) is an effective paradigm for improving the performance of large language models under limited annotation budgets. To address the inefficiency of conventional fine-tuning objectives in AFSL, this paper proposes a supervised contrastive fine-tuning framework designed for natural language processing (NLP) text classification tasks. By integrating Supervised Contrastive Learning (SCL) with Hard Negative Mining (HNM), the framework optimizes the embedding space through an enhanced hybrid loss function, thereby improving the utilization efficiency of labeled samples. Extensive experiments on five benchmark datasets show that, under a fixed state-of-the-art (SOTA) query strategy, our method consistently outperforms baseline models in text classification performance and generalizes well across backbone architectures and acquisition functions. These findings demonstrate that optimizing how to learn, through improved learning objectives, offers a direction complementary to existing query strategies for advancing AFSL.
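To make the core idea concrete, the following is a minimal NumPy sketch of a supervised contrastive loss in which each anchor's denominator is restricted to its in-class positives plus the k most similar (hardest) negatives. This is an illustrative approximation of the SCL + HNM combination described above, not the paper's exact loss; the function name, the temperature value, and the top-k selection rule are assumptions for the example.

```python
import numpy as np

def supcon_hnm_loss(embeddings, labels, tau=0.1, k_hard=2):
    """Illustrative supervised contrastive loss with hard negative mining.

    For each anchor i, positives are same-label samples; the contrastive
    denominator keeps all positives plus only the k_hard most similar
    negatives (hard negative mining). Hypothetical sketch, not the
    paper's exact formulation.
    """
    # L2-normalize so dot products are cosine similarities
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = (z @ z.T) / tau  # temperature-scaled pairwise similarities
    n = len(labels)
    losses = []
    for i in range(n):
        pos = [j for j in range(n) if j != i and labels[j] == labels[i]]
        neg = [j for j in range(n) if labels[j] != labels[i]]
        if not pos or not neg:
            continue  # anchor needs at least one positive and one negative
        # Hard negative mining: keep only the k most similar negatives
        hard = sorted(neg, key=lambda j: sim[i, j], reverse=True)[:k_hard]
        denom = np.log(np.sum(np.exp(sim[i, pos + hard])))
        # Average -log p(positive | anchor) over all positives
        losses.append(-np.mean([sim[i, p] - denom for p in pos]))
    return float(np.mean(losses))

# Toy usage: two classes of four embeddings each
rng = np.random.default_rng(0)
emb = rng.normal(size=(8, 4))
labels = np.array([0, 0, 0, 0, 1, 1, 1, 1])
loss = supcon_hnm_loss(emb, labels)
```

Because each anchor's denominator always contains its positives, every per-anchor term is strictly positive, and tightening same-class clusters while pushing hard negatives away drives the loss toward its lower bound.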