Exploring and Predicting Transferability across NLP Tasks
Tu Vu, Tong Wang, Tsendsuren Munkhdalai, Alessandro Sordoni, Adam Trischler, Andrew Mattarella-Micke, Subhransu Maji, Mohit Iyyer
Abstract
Recent advances in NLP demonstrate the effectiveness of training large-scale language models and transferring them to downstream tasks. Can fine-tuning these models on tasks other than language modeling further improve performance? In this paper, we conduct an extensive study of the transferability between 33 NLP tasks across three broad classes of problems (text classification, question answering, and sequence labeling). Our results show that transfer learning is more beneficial than previously thought, especially when target task data is scarce, and can improve performance even with low-data source tasks that differ substantially from the target task (e.g., part-of-speech tagging transfers well to the DROP QA dataset). We also develop task embeddings that can be used to predict the most transferable source tasks for a given target task, and we validate their effectiveness in experiments controlled for source and target data size. Overall, our experiments reveal that factors such as data size, task and domain similarity, and task complexity all play a role in determining transferability.
- Anthology ID: 2020.emnlp-main.635
- Volume: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
- Month: November
- Year: 2020
- Address: Online
- Editors: Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
- Venue: EMNLP
- Publisher: Association for Computational Linguistics
- Pages: 7882–7926
- URL: https://aclanthology.org/2020.emnlp-main.635
- DOI: 10.18653/v1/2020.emnlp-main.635
- Cite (ACL): Tu Vu, Tong Wang, Tsendsuren Munkhdalai, Alessandro Sordoni, Adam Trischler, Andrew Mattarella-Micke, Subhransu Maji, and Mohit Iyyer. 2020. Exploring and Predicting Transferability across NLP Tasks. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 7882–7926, Online. Association for Computational Linguistics.
- Cite (Informal): Exploring and Predicting Transferability across NLP Tasks (Vu et al., EMNLP 2020)
- PDF: https://aclanthology.org/2020.emnlp-main.635.pdf
- Code: tuvuumass/task-transferability
- Data: BoolQ, CoLA, ComQA, DROP, GLUE, HotpotQA, MRPC, MultiNLI, NewsQA, Penn Treebank, QNLI, SNLI, SQuAD, SST, SST-2, WikiHop
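
As a concrete illustration of the abstract's task-embedding idea, the sketch below ranks candidate source tasks by cosine similarity between task embeddings. This is a minimal, self-contained example with random placeholder vectors, not the authors' released tuvuumass/task-transferability code: in the paper, the embeddings are derived from a model fine-tuned on each task (e.g., via Fisher information), and the task names and function names here are purely illustrative.

```python
# Hedged sketch: rank source tasks for a target task by cosine similarity
# of task embeddings. Placeholder vectors stand in for the paper's
# model-derived embeddings; this is NOT the authors' released code.
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two task-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def rank_source_tasks(target_emb: np.ndarray, source_embs: dict) -> list:
    """Return source task names sorted from most to least similar.

    source_embs maps task name -> embedding vector; higher cosine
    similarity is taken as a prediction of better transfer.
    """
    scores = {name: cosine_similarity(target_emb, emb)
              for name, emb in source_embs.items()}
    return sorted(scores, key=scores.get, reverse=True)


# Toy usage with random stand-in embeddings (task names are examples only).
rng = np.random.default_rng(0)
sources = {task: rng.normal(size=768)
           for task in ["MNLI", "SQuAD", "POS-PTB"]}
target = rng.normal(size=768)
print(rank_source_tasks(target, sources))  # best-to-worst predicted transfer
```

In the paper's setting, the ranking produced this way is evaluated against the empirically observed transfer results across the 33 tasks; the sketch only shows the similarity-and-rank step, not the embedding computation itself.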