Multi-Task Supervised Pretraining for Neural Domain Adaptation
Sara Meftah, Nasredine Semmar, Mohamed-Ayoub Tahiri, Youssef Tamaazousti, Hassane Essafi, Fatiha Sadat
Abstract
Two transfer learning approaches are prevalent in recent work for improving neural network performance on domains with small amounts of annotated data: multi-task learning, which trains the task of interest jointly with related auxiliary tasks to exploit their underlying similarities, and mono-task fine-tuning, where the model's weights are initialized with weights pretrained on a large-scale labeled source domain and then fine-tuned on labeled data from the target domain (the domain of interest). In this paper, we propose a new approach that takes advantage of both: a hierarchical model is trained across multiple tasks of a source domain and then fine-tuned on multiple tasks of the target domain. Our experiments on four tasks in the social media domain show that the proposed approach leads to significant improvements on all tasks compared to both approaches.
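As a rough illustration of the training scheme described in the abstract, the sketch below pretrains a shared hierarchical encoder with one classification head per source-domain task, then fine-tunes the same encoder with fresh heads on target-domain tasks. This is a minimal PyTorch sketch, not the authors' implementation: the model class, task names (pos, chunking, ner), label counts, and hyperparameters are illustrative assumptions.

```python
# Minimal sketch (assumed architecture, not the authors' code) of multi-task
# supervised pretraining followed by multi-task fine-tuning.
import torch
import torch.nn as nn


class MultiTaskTagger(nn.Module):
    """Shared word-level encoder with one classification head per task."""

    def __init__(self, vocab_size, emb_dim, hidden_dim, task_label_sizes):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                               bidirectional=True)
        # One linear head per task (e.g. POS tagging, chunking, NER, ...).
        self.heads = nn.ModuleDict({
            task: nn.Linear(2 * hidden_dim, n_labels)
            for task, n_labels in task_label_sizes.items()
        })

    def forward(self, token_ids, task):
        embedded = self.embedding(token_ids)
        hidden, _ = self.encoder(embedded)
        return self.heads[task](hidden)  # per-token logits for the given task


def train_multitask(model, batches, epochs=1, lr=1e-3):
    """Alternate over task-specific batches; all tasks share the encoder."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for task, token_ids, labels in batches:
            logits = model(token_ids, task)
            loss = loss_fn(logits.view(-1, logits.size(-1)), labels.view(-1))
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()


# 1) Multi-task pretraining on source-domain tasks (hypothetical task names).
source_tasks = {"pos": 17, "chunking": 9}
source_model = MultiTaskTagger(vocab_size=10000, emb_dim=64, hidden_dim=64,
                               task_label_sizes=source_tasks)
src_batch = ("pos", torch.randint(0, 10000, (2, 5)), torch.randint(0, 17, (2, 5)))
train_multitask(source_model, [src_batch])

# 2) Multi-task fine-tuning on target-domain tasks: reuse the pretrained
#    embedding and encoder, attach fresh heads, and continue training.
target_tasks = {"pos": 17, "ner": 5}
target_model = MultiTaskTagger(vocab_size=10000, emb_dim=64, hidden_dim=64,
                               task_label_sizes=target_tasks)
target_model.embedding.load_state_dict(source_model.embedding.state_dict())
target_model.encoder.load_state_dict(source_model.encoder.state_dict())
tgt_batch = ("ner", torch.randint(0, 10000, (2, 5)), torch.randint(0, 5, (2, 5)))
train_multitask(target_model, [tgt_batch])
```

Transferring the embedding and encoder weights while re-initializing the task heads is what distinguishes this setup from training the target-domain tasks from scratch; the actual architecture and task set used in the paper may differ.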
- Anthology ID:
- 2020.socialnlp-1.8
- Volume:
- Proceedings of the Eighth International Workshop on Natural Language Processing for Social Media
- Month:
- July
- Year:
- 2020
- Address:
- Online
- Editors:
- Lun-Wei Ku, Cheng-Te Li
- Venue:
- SocialNLP
- Publisher:
- Association for Computational Linguistics
- Pages:
- 61–71
- URL:
- https://aclanthology.org/2020.socialnlp-1.8
- DOI:
- 10.18653/v1/2020.socialnlp-1.8
- Cite (ACL):
- Sara Meftah, Nasredine Semmar, Mohamed-Ayoub Tahiri, Youssef Tamaazousti, Hassane Essafi, and Fatiha Sadat. 2020. Multi-Task Supervised Pretraining for Neural Domain Adaptation. In Proceedings of the Eighth International Workshop on Natural Language Processing for Social Media, pages 61–71, Online. Association for Computational Linguistics.
- Cite (Informal):
- Multi-Task Supervised Pretraining for Neural Domain Adaptation (Meftah et al., SocialNLP 2020)
- PDF:
- https://aclanthology.org/2020.socialnlp-1.8.pdf