Fine-tuning BERT for Low-Resource Natural Language Understanding via Active Learning

Daniel Grießhaber, Johannes Maucher, Ngoc Thang Vu


Abstract
Recently, leveraging pre-trained Transformer-based language models in downstream, task-specific models has advanced state-of-the-art results in natural language understanding tasks. However, little research has explored the suitability of this approach in low-resource settings with fewer than 1,000 training data points. In this work, we explore fine-tuning methods of BERT, a pre-trained Transformer-based language model, by utilizing pool-based active learning to speed up training while keeping the cost of labeling new data constant. Our experimental results on the GLUE data set show an advantage in model performance when maximizing the approximate knowledge gain of the model while querying from the pool of unlabeled data. Finally, we demonstrate and analyze the benefits of freezing layers of the language model during fine-tuning to reduce the number of trainable parameters, making it more suitable for low-resource settings.
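
The sketch below (not the authors' released code) illustrates the two ingredients the abstract names: freezing lower BERT layers to shrink the number of trainable parameters, and querying a pool of unlabeled examples with an acquisition function. It assumes PyTorch and the Hugging Face transformers library; plain entropy-based uncertainty sampling stands in for the paper's approximate-knowledge-gain criterion, and the cutoff of eight frozen encoder layers is an illustrative choice, not the paper's.

import torch
from transformers import BertForSequenceClassification, BertTokenizerFast

model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)
tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")

# Freeze the embeddings and the lower encoder layers so that only the top
# layers (and the classification head) remain trainable.
for param in model.bert.embeddings.parameters():
    param.requires_grad = False
for layer in model.bert.encoder.layer[:8]:  # illustrative cutoff, not from the paper
    for param in layer.parameters():
        param.requires_grad = False

@torch.no_grad()
def query_pool(unlabeled_texts, k=50):
    """Return indices of the k pool examples with the most uncertain predictions
    (entropy-based stand-in for the paper's acquisition function)."""
    model.eval()
    enc = tokenizer(unlabeled_texts, padding=True, truncation=True, return_tensors="pt")
    probs = model(**enc).logits.softmax(dim=-1)
    entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1)
    return entropy.topk(min(k, len(unlabeled_texts))).indices.tolist()

In each active-learning round, the selected indices would be sent to an annotator, the newly labeled examples added to the training set, and the partially frozen model fine-tuned again before the next query.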
Anthology ID: 2020.coling-main.100
Volume: Proceedings of the 28th International Conference on Computational Linguistics
Month: December
Year: 2020
Address: Barcelona, Spain (Online)
Venue: COLING
Publisher: International Committee on Computational Linguistics
Pages: 1158–1171
URL: https://aclanthology.org/2020.coling-main.100
DOI: 10.18653/v1/2020.coling-main.100
Cite (ACL): Daniel Grießhaber, Johannes Maucher, and Ngoc Thang Vu. 2020. Fine-tuning BERT for Low-Resource Natural Language Understanding via Active Learning. In Proceedings of the 28th International Conference on Computational Linguistics, pages 1158–1171, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal): Fine-tuning BERT for Low-Resource Natural Language Understanding via Active Learning (Grießhaber et al., COLING 2020)
PDF: https://preview.aclanthology.org/ingestion-script-update/2020.coling-main.100.pdf
Data: GLUE, QNLI