Few-shot initializing of Active Learner via Meta-Learning

Zi Long Zhu, Vikrant Yadav, Zubair Afzal, George Tsatsaronis


Abstract
Despite important advances in few-shot and zero-shot learning techniques, domain-specific applications still require expert knowledge and significant effort in annotating and labeling large volumes of unstructured textual data. To mitigate this problem, active learning and meta-learning attempt to reach high performance with the least amount of labeled data. In this paper, we introduce a novel approach that combines both lines of work by initializing an active learner with meta-learned parameters obtained through meta-training on tasks similar to the target task of active learning. In this approach we use pre-trained BERT as our text encoder and meta-learn its parameters with LEOPARD, which extends the model-agnostic meta-learning method by generating task-dependent softmax weights to enable learning across tasks with different numbers of classes. We demonstrate the effectiveness of our method by performing active learning on five natural language understanding tasks and six datasets with five different acquisition functions. We train two different meta-initializations, using the pre-trained BERT base initialization as the baseline. We observe that our approach performs better than the baseline at low budgets, especially when closely related tasks were present during meta-learning. Moreover, our results show that better performance in the initial phase, i.e., with fewer labeled samples, leads to better performance when larger acquisition batches are used. We also perform an ablation study of the proposed method, showing that active learning with only the meta-learned weights is beneficial, and that adding the meta-learned learning rates or generating the softmax weights has negative consequences for performance.
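The core idea in the abstract, i.e., pool-based active learning that fine-tunes from a fixed (meta-learned) initialization each round and acquires the most uncertain unlabeled examples, can be sketched in miniature. This is a toy illustration only, not the paper's actual pipeline (which uses BERT and LEOPARD): here the model is a linear softmax classifier, the acquisition function is predictive entropy, and all function names (`active_learning_loop`, `entropy_acquisition`) are hypothetical.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def entropy_acquisition(probs, k):
    """Select the k pool indices with the highest predictive entropy."""
    ent = -(probs * np.log(probs + 1e-12)).sum(axis=1)
    return np.argsort(-ent)[:k]

def active_learning_loop(X_pool, y_pool, W_init, rounds=3, k=5, lr=0.1, epochs=50):
    """Pool-based active learning for a linear softmax classifier.

    W_init plays the role of the meta-learned initialization: each round
    fine-tunes from it on the current labeled set, rather than starting
    from a random point.
    """
    labeled = list(range(k))  # small seed set of labeled examples
    W = W_init.copy()
    for _ in range(rounds):
        # Fine-tune from the (meta-learned) initialization on labeled data.
        W = W_init.copy()
        Xl, yl = X_pool[labeled], y_pool[labeled]
        Y = np.eye(W.shape[1])[yl]  # one-hot targets
        for _ in range(epochs):
            P = softmax(Xl @ W)
            W -= lr * Xl.T @ (P - Y) / len(labeled)
        # Acquire the k most uncertain unlabeled points for annotation.
        unlabeled = [i for i in range(len(X_pool)) if i not in labeled]
        probs = softmax(X_pool[unlabeled] @ W)
        picked = entropy_acquisition(probs, k)
        labeled += [unlabeled[i] for i in picked]
    return W, labeled
```

Under this sketch, a better `W_init` (e.g., one meta-trained on related tasks) yields a stronger model at small labeling budgets, which in turn makes the entropy-based acquisition more informative in early rounds, mirroring the abstract's observation about low-budget performance.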
Anthology ID:
2022.findings-emnlp.80
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1117–1133
URL:
https://aclanthology.org/2022.findings-emnlp.80
Cite (ACL):
Zi Long Zhu, Vikrant Yadav, Zubair Afzal, and George Tsatsaronis. 2022. Few-shot initializing of Active Learner via Meta-Learning. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 1117–1133, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Few-shot initializing of Active Learner via Meta-Learning (Zhu et al., Findings 2022)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2022.findings-emnlp.80.pdf