Abstract
Intermediate-task transfer can benefit a wide range of NLP tasks when source datasets are properly selected. However, it is computationally infeasible to experiment with all intermediate transfer combinations, which makes choosing a useful source task challenging. In this paper, we anticipate that the task-specific parameters updated by parameter-efficient tuning methods are likely to encode task-specific information and can therefore be predictive of inter-task transferability. We thus propose to exploit these efficiently tuned parameters as off-the-shelf task embeddings for the efficient selection of source datasets for intermediate-task transfer. We experiment with 11 text classification tasks and 11 question answering tasks. Experimental results show that our approach consistently outperforms existing inter-task transferability prediction methods while being conceptually simple and computationally efficient. Our analysis also reveals that the ability of efficiently tuned parameters to predict transferability is disentangled from their in-task performance. This allows us to use parameters from early checkpoints as task embeddings to further improve efficiency.
- Anthology ID: 2022.emnlp-main.334
- Volume: Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
- Month: December
- Year: 2022
- Address: Abu Dhabi, United Arab Emirates
- Editors: Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
- Venue: EMNLP
- Publisher: Association for Computational Linguistics
- Pages: 5007–5014
- URL: https://aclanthology.org/2022.emnlp-main.334
- DOI: 10.18653/v1/2022.emnlp-main.334
- Cite (ACL): Wangchunshu Zhou, Canwen Xu, and Julian McAuley. 2022. Efficiently Tuned Parameters Are Task Embeddings. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 5007–5014, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
- Cite (Informal): Efficiently Tuned Parameters Are Task Embeddings (Zhou et al., EMNLP 2022)
- PDF: https://preview.aclanthology.org/add_acl24_videos/2022.emnlp-main.334.pdf
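To make the idea in the abstract concrete, below is a minimal sketch (not the authors' released code) of treating parameter-efficiently tuned parameters as task embeddings: each task's tuned parameters (e.g., a soft prompt or adapter weights) are flattened into a vector, and candidate source tasks are ranked by cosine similarity to the target task. The function names and the randomly generated stand-in parameters are hypothetical; the abstract does not specify the exact similarity measure, so cosine similarity is an assumption here.

```python
import numpy as np


def task_embedding(param_arrays):
    """Flatten and concatenate a task's tuned parameters into one vector."""
    return np.concatenate([p.ravel() for p in param_arrays])


def cosine(u, v):
    """Cosine similarity with a small epsilon for numerical safety."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))


def rank_source_tasks(target_params, source_params_by_task):
    """Rank candidate source tasks by predicted transferability to the target."""
    target_vec = task_embedding(target_params)
    scores = {
        name: cosine(target_vec, task_embedding(params))
        for name, params in source_params_by_task.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)


if __name__ == "__main__":
    # Stand-ins for tuned parameters; in practice these would be loaded from
    # prompt-tuning or adapter checkpoints trained on each task.
    rng = np.random.default_rng(0)
    fake_params = lambda: [rng.normal(size=(20, 768))]
    ranking = rank_source_tasks(
        fake_params(),
        {"mnli": fake_params(), "squad": fake_params(), "sst2": fake_params()},
    )
    print(ranking)  # highest-scoring source tasks first
```

Because the tuned parameters are a by-product of ordinary parameter-efficient training, this selection step requires no extra transfer experiments beyond one similarity computation per candidate source task.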