UDAPTER - Efficient Domain Adaptation Using Adapters
Bhavitvya Malik, Abhinav Ramesh Kashyap, Min-Yen Kan, Soujanya Poria
Abstract
We propose two methods to make unsupervised domain adaptation (UDA) more parameter-efficient using adapters – small bottleneck layers interspersed with every layer of the large-scale pre-trained language model (PLM). The first method deconstructs UDA into a two-step process: first by adding a domain adapter to learn domain-invariant information and then by adding a task adapter that uses domain-invariant information to learn task representations in the source domain. The second method jointly learns a supervised classifier while reducing the divergence measure. Compared to strong baselines, our simple methods perform well in natural language inference (MNLI) and the cross-domain sentiment classification task. We even outperform unsupervised domain adaptation methods such as DANN and DSN in sentiment classification, and we are within 0.85% F1 for the natural language inference task, by fine-tuning only a fraction of the full model parameters. We release our code at this URL.
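To make the two methods described in the abstract concrete, here is a minimal PyTorch sketch. The `BottleneckAdapter` module, the bottleneck size of 48, the linear-kernel MMD divergence, and the 0.1 trade-off weight are illustrative assumptions, not the paper's exact implementation; the toy tensors below stand in for frozen PLM hidden states, whereas the actual method places adapters inside every transformer layer.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BottleneckAdapter(nn.Module):
    """Residual bottleneck layer of the kind inserted into each PLM layer."""
    def __init__(self, hidden: int = 768, bottleneck: int = 48):
        super().__init__()
        self.down = nn.Linear(hidden, bottleneck)
        self.up = nn.Linear(bottleneck, hidden)
        self.act = nn.GELU()

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        return h + self.up(self.act(self.down(h)))

def divergence(source: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """Linear-kernel MMD between batch means (an assumed, illustrative divergence)."""
    return (source.mean(dim=0) - target.mean(dim=0)).pow(2).sum()

# Stand-ins for frozen PLM hidden states of a labelled source batch and an
# unlabelled target batch (batch size 16, hidden size 768).
src_hidden = torch.randn(16, 768)
tgt_hidden = torch.randn(16, 768)
src_labels = torch.randint(0, 3, (16,))  # e.g. 3 NLI labels

domain_adapter = BottleneckAdapter()
task_adapter = BottleneckAdapter()
classifier = nn.Linear(768, 3)

# Method 1 (two-step): first train the domain adapter to reduce the divergence
# between source and target representations ...
domain_loss = divergence(domain_adapter(src_hidden), domain_adapter(tgt_hidden))

# ... then freeze it, and train a stacked task adapter plus head with the
# supervised task loss on source-domain labels only.
for p in domain_adapter.parameters():
    p.requires_grad = False
logits = classifier(task_adapter(domain_adapter(src_hidden)))
task_loss = F.cross_entropy(logits, src_labels)

# Method 2 (joint): a single combined objective that learns the supervised
# classifier while reducing the divergence measure in one training loop.
joint_loss = task_loss + 0.1 * domain_loss  # 0.1 is an assumed trade-off weight
print(domain_loss.item(), task_loss.item(), joint_loss.item())
```

In both variants only the adapter and head parameters receive gradients while the PLM stays frozen, which is where the parameter efficiency claimed in the abstract comes from.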
- Anthology ID: 2023.eacl-main.165
- Volume: Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics
- Month: May
- Year: 2023
- Address: Dubrovnik, Croatia
- Editors: Andreas Vlachos, Isabelle Augenstein
- Venue: EACL
- Publisher: Association for Computational Linguistics
- Pages: 2249–2263
- URL: https://aclanthology.org/2023.eacl-main.165
- DOI: 10.18653/v1/2023.eacl-main.165
- Cite (ACL): Bhavitvya Malik, Abhinav Ramesh Kashyap, Min-Yen Kan, and Soujanya Poria. 2023. UDAPTER - Efficient Domain Adaptation Using Adapters. In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics, pages 2249–2263, Dubrovnik, Croatia. Association for Computational Linguistics.
- Cite (Informal): UDAPTER - Efficient Domain Adaptation Using Adapters (Malik et al., EACL 2023)
- PDF: https://preview.aclanthology.org/emnlp22-frontmatter/2023.eacl-main.165.pdf