Pre-train or Annotate? Domain Adaptation with a Constrained Budget

Fan Bai, Alan Ritter, Wei Xu


Abstract
Recent work has demonstrated that pre-training in-domain language models can boost performance when adapting to a new domain. However, the costs associated with pre-training raise an important question: given a fixed budget, what steps should an NLP practitioner take to maximize performance? In this paper, we study domain adaptation under budget constraints, and approach it as a consumer choice problem between data annotation and pre-training. Specifically, we measure the annotation cost of three procedural text datasets and the pre-training cost of three in-domain language models. Then we evaluate the utility of different combinations of pre-training and data annotation under varying budget constraints to assess which combination strategy works best. We find that, for small budgets, spending all funds on annotation leads to the best performance; once the budget becomes large enough, a combination of data annotation and in-domain pre-training is more effective. We therefore suggest that task-specific data annotation should be part of an economical strategy when adapting an NLP model to a new domain.
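The abstract's finding can be illustrated with a toy budget-allocation sketch. Everything below is hypothetical: the cost and utility numbers are invented for illustration and are not the paper's measurements. The key modeling assumption is that pre-training is a lumpy investment (it only pays off once its full fixed cost is covered), which reproduces the qualitative result that small budgets favor annotation alone while large budgets favor a mix.

```python
import math

def utility(ann, pre, pretrain_cost=5000):
    # Toy utility: annotated data has diminishing returns, and in-domain
    # pre-training amplifies it -- but only if the (hypothetical) fixed
    # pre-training cost is fully paid. These numbers are illustrative.
    gain = math.log1p(ann)
    if pre >= pretrain_cost:
        gain += 0.5 * math.log1p(ann)  # in-domain LM boosts annotated data
    return gain

def best_allocation(budget, step=100):
    # Enumerate splits of a fixed budget between annotation and pre-training
    # and return the split with the highest toy utility.
    candidates = [(a, budget - a) for a in range(0, budget + 1, step)]
    return max(candidates, key=lambda split: utility(*split))

print(best_allocation(1000))   # small budget: all funds go to annotation
print(best_allocation(20000))  # large budget: annotation plus pre-training
```

Under these assumed numbers, a $1,000 budget is spent entirely on annotation, while a $20,000 budget covers the fixed pre-training cost and puts the remainder into annotation, mirroring the paper's qualitative conclusion.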
Anthology ID:
2021.emnlp-main.409
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
5002–5015
URL:
https://aclanthology.org/2021.emnlp-main.409
DOI:
10.18653/v1/2021.emnlp-main.409
Cite (ACL):
Fan Bai, Alan Ritter, and Wei Xu. 2021. Pre-train or Annotate? Domain Adaptation with a Constrained Budget. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 5002–5015, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Pre-train or Annotate? Domain Adaptation with a Constrained Budget (Bai et al., EMNLP 2021)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2021.emnlp-main.409.pdf
Video:
https://preview.aclanthology.org/ingestion-script-update/2021.emnlp-main.409.mp4
Code:
bflashcp3f/procbert