Meta Fine-Tuning Neural Language Models for Multi-Domain Text Mining

Chengyu Wang, Minghui Qiu, Jun Huang, Xiaofeng He


Abstract
Pre-trained neural language models bring significant improvements to various NLP tasks by fine-tuning the models on task-specific training sets. During fine-tuning, the parameters are initialized directly from the pre-trained models, which ignores how the learning processes of similar NLP tasks in different domains are correlated and mutually reinforce one another. In this paper, we propose an effective learning procedure named Meta Fine-Tuning (MFT), which serves as a meta-learner that solves a group of similar NLP tasks for neural language models. Instead of simply multi-task training over all the datasets, MFT learns only from typical instances of the various domains to acquire highly transferable knowledge. It further encourages the language model to encode domain-invariant representations by optimizing a series of novel domain corruption loss functions. After MFT, the model can be fine-tuned for each domain with better parameter initializations and higher generalization ability. We implement MFT on BERT to solve several multi-domain text mining tasks. Experimental results confirm the effectiveness of MFT and its usefulness for few-shot learning.
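To make the recipe in the abstract concrete, below is a minimal, illustrative sketch of an MFT-style training pass, not the authors' released code (see the Code link below). The toy pooled encoder, the centroid-based heuristic for picking "typical" instances, and the domain-confusion penalty are all assumptions standing in for the paper's own typicality scoring and domain corruption losses; the per-domain fine-tuning that follows MFT reuses the returned encoder as its initialization.

```python
# Hedged sketch of Meta Fine-Tuning (MFT): task loss on "typical" instances
# plus a domain-invariance penalty, before per-domain fine-tuning.
# Encoder, typicality heuristic, and confusion loss are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Encoder(nn.Module):
    """Toy stand-in for a pre-trained language model such as BERT."""
    def __init__(self, vocab_size=30522, hidden=256):
        super().__init__()
        self.embed = nn.EmbeddingBag(vocab_size, hidden)  # pooled "sentence" vector

    def forward(self, token_ids):          # token_ids: (batch, seq_len)
        return self.embed(token_ids)       # returns (batch, hidden)


def select_typical(batches, encoder, keep_ratio=0.5):
    """Keep instances closest to their batch centroid -- a simple stand-in
    for the paper's notion of 'typical' instances per domain."""
    kept = []
    for token_ids, labels, domains in batches:
        with torch.no_grad():
            reps = encoder(token_ids)
            centroid = reps.mean(dim=0, keepdim=True)
            dist = (reps - centroid).norm(dim=1)
            k = max(1, int(keep_ratio * len(dist)))
            idx = dist.topk(k, largest=False).indices
        kept.append((token_ids[idx], labels[idx], domains[idx]))
    return kept


def meta_fine_tune(encoder, task_head, domain_head, batches, lam=0.1, lr=1e-4):
    """One MFT-style pass: task loss on typical instances plus a domain-confusion
    penalty that discourages domain-specific information in the representation
    (standing in for the paper's domain corruption losses)."""
    main_opt = torch.optim.Adam(
        list(encoder.parameters()) + list(task_head.parameters()), lr=lr)
    dom_opt = torch.optim.Adam(domain_head.parameters(), lr=lr)
    for token_ids, labels, domains in select_typical(batches, encoder):
        # (1) Train the domain classifier on detached representations.
        reps = encoder(token_ids).detach()
        dom_loss = F.cross_entropy(domain_head(reps), domains)
        dom_opt.zero_grad()
        dom_loss.backward()
        dom_opt.step()

        # (2) Update encoder + task head: task loss plus a confusion term that
        # pushes the (now fixed) domain classifier toward a uniform guess.
        reps = encoder(token_ids)
        task_loss = F.cross_entropy(task_head(reps), labels)
        confusion = -F.log_softmax(domain_head(reps), dim=1).mean()
        loss = task_loss + lam * confusion
        main_opt.zero_grad()
        loss.backward()
        main_opt.step()
    return encoder  # MFT initialization, to be fine-tuned per domain afterwards
```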
Anthology ID:
2020.emnlp-main.250
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month:
November
Year:
2020
Address:
Online
Editors:
Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
3094–3104
URL:
https://aclanthology.org/2020.emnlp-main.250
DOI:
10.18653/v1/2020.emnlp-main.250
Cite (ACL):
Chengyu Wang, Minghui Qiu, Jun Huang, and Xiaofeng He. 2020. Meta Fine-Tuning Neural Language Models for Multi-Domain Text Mining. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 3094–3104, Online. Association for Computational Linguistics.
Cite (Informal):
Meta Fine-Tuning Neural Language Models for Multi-Domain Text Mining (Wang et al., EMNLP 2020)
PDF:
https://preview.aclanthology.org/nschneid-patch-1/2020.emnlp-main.250.pdf
Code:
additional community code
Data
MultiNLI