DAdEE: Unsupervised Domain Adaptation in Early Exit PLMs

Divya Jyoti Bajpai, Manjesh Kumar Hanawal


Abstract
Pre-trained Language Models (PLMs) exhibit good accuracy and generalization ability across various tasks using self-supervision, but their large size results in high inference latency. Early Exit (EE) strategies address this issue by allowing samples to exit early through classifiers attached to intermediate layers, but they do not generalize well, as exit classifiers can be sensitive to domain changes. To address this, we propose DAdEE, an Unsupervised Domain Adaptation framework for EE PLMs that employs multi-level adaptation using knowledge distillation. DAdEE uses GAN-based adversarial adaptation at each layer to achieve domain-invariant representations, reducing the domain gap between the source and target domains across all layers. The attached exits not only speed up inference but also enhance domain adaptation by reducing catastrophic forgetting and mode collapse, making the approach more suitable for real-world scenarios. Experiments on tasks such as sentiment analysis, entailment classification, and natural language inference demonstrate that DAdEE consistently outperforms not only early exit methods but also various domain adaptation methods under domain shift. The source code is available at https://github.com/Div290/DAdEE.
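For readers who want a concrete picture of the mechanism sketched in the abstract, the snippet below illustrates the general recipe of per-layer early-exit classifiers combined with GAN-style adversarial domain adaptation, written in PyTorch. It is a minimal sketch, not the authors' released implementation (see the repository linked above): all class and function names, the confidence threshold, and the loss formulation are assumptions made here for illustration, and the knowledge-distillation component of DAdEE is omitted.

```python
# Illustrative sketch only: early-exit heads on intermediate layers plus
# per-layer domain discriminators trained adversarially so hidden states
# become domain-invariant. Names and hyperparameters are assumptions.

import torch
import torch.nn as nn

class ExitClassifier(nn.Module):
    """Lightweight classifier attached to one intermediate layer."""
    def __init__(self, hidden_size: int, num_labels: int):
        super().__init__()
        self.head = nn.Linear(hidden_size, num_labels)

    def forward(self, hidden_state: torch.Tensor) -> torch.Tensor:
        # Classify from the [CLS] (first-token) representation of the layer.
        return self.head(hidden_state[:, 0])

class DomainDiscriminator(nn.Module):
    """Per-layer discriminator predicting source (0) vs. target (1) domain."""
    def __init__(self, hidden_size: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(hidden_size, hidden_size), nn.ReLU(),
            nn.Linear(hidden_size, 1),
        )

    def forward(self, hidden_state: torch.Tensor) -> torch.Tensor:
        return self.net(hidden_state[:, 0])

def adversarial_layer_losses(src_hidden, tgt_hidden, discriminators):
    """GAN-style losses at every layer: each discriminator separates the two
    domains, while the encoder is updated to fool it (domain invariance)."""
    bce = nn.BCEWithLogitsLoss()
    disc_loss = torch.tensor(0.0)
    enc_loss = torch.tensor(0.0)
    for h_s, h_t, disc in zip(src_hidden, tgt_hidden, discriminators):
        d_s, d_t = disc(h_s.detach()), disc(h_t.detach())
        disc_loss = disc_loss + bce(d_s, torch.zeros_like(d_s)) \
                              + bce(d_t, torch.ones_like(d_t))
        # Encoder objective: make target features indistinguishable from source.
        enc_loss = enc_loss + bce(disc(h_t), torch.zeros_like(d_t))
    return disc_loss, enc_loss

def early_exit_inference(hidden_states, exits, threshold=0.9):
    """Run one example (batch size 1) through the exits in layer order and
    stop at the first classifier whose confidence clears the threshold."""
    pred, layer_idx = None, -1
    for layer_idx, (h, clf) in enumerate(zip(hidden_states, exits)):
        probs = torch.softmax(clf(h), dim=-1)
        conf, pred = probs.max(dim=-1)
        if conf.item() >= threshold:
            break
    return pred, layer_idx
```

In a typical adversarial-adaptation training loop, one would alternate updates: first step the discriminators on `disc_loss`, then step the encoder (and, in an EE setting, the exit classifiers) on `enc_loss` plus the task or distillation losses. The exact schedule and loss weighting used by DAdEE are described in the paper, not in this sketch.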
Anthology ID: 2024.findings-emnlp.371
Volume: Findings of the Association for Computational Linguistics: EMNLP 2024
Month: November
Year: 2024
Address: Miami, Florida, USA
Editors: Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 6389–6400
URL: https://preview.aclanthology.org/build-pipeline-with-new-library/2024.findings-emnlp.371/
DOI: 10.18653/v1/2024.findings-emnlp.371
Cite (ACL): Divya Jyoti Bajpai and Manjesh Kumar Hanawal. 2024. DAdEE: Unsupervised Domain Adaptation in Early Exit PLMs. In Findings of the Association for Computational Linguistics: EMNLP 2024, pages 6389–6400, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal): DAdEE: Unsupervised Domain Adaptation in Early Exit PLMs (Bajpai & Hanawal, Findings 2024)
PDF: https://preview.aclanthology.org/build-pipeline-with-new-library/2024.findings-emnlp.371.pdf