FAD-X: Fusing Adapters for Cross-lingual Transfer to Low-Resource Languages

Jaeseong Lee, Seung-won Hwang, Taesup Kim


Abstract
Adapter-based tuning adds lightweight adapters to multilingual pretrained language models (mPLMs) and selectively updates language-specific parameters to adapt to a new language, instead of finetuning all shared weights. This paper explores an effective way to leverage a public pool of pretrained language adapters to overcome resource imbalances for low-resource languages (LRLs). Specifically, our research question is whether pretrained adapters can be composed to complement or replace LRL adapters. While composing adapters has been studied in the multi-task learning setting, the same question for LRLs has remained largely unanswered. To answer it, we study how to fuse adapters across languages and tasks, then validate how our proposed fusion adapter, namely FAD-X, enhances cross-lingual transfer from pretrained adapters on well-known named entity recognition and classification benchmarks.
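To make the core idea concrete: a language adapter is a small bottleneck network inserted into a frozen mPLM layer, and fusing adapters means attending over the outputs of several pretrained adapters and taking a weighted combination. The sketch below is an illustration of that mechanism only, not the authors' implementation; the adapter pool, dimensions, and the simplified dot-product attention (real AdapterFusion learns separate query/key/value projections) are all assumptions made for the example.

```python
import math
import random

random.seed(0)
DIM, BOTTLENECK = 8, 2  # toy hidden size and adapter bottleneck size

def make_adapter(dim=DIM, bottleneck=BOTTLENECK):
    """A bottleneck adapter: down-project, ReLU, up-project, residual add."""
    down = [[random.gauss(0, 0.1) for _ in range(bottleneck)] for _ in range(dim)]
    up = [[random.gauss(0, 0.1) for _ in range(dim)] for _ in range(bottleneck)]
    def adapter(h):
        z = [max(0.0, sum(h[i] * down[i][j] for i in range(dim)))
             for j in range(bottleneck)]                      # down-projection + ReLU
        out = [sum(z[j] * up[j][k] for j in range(bottleneck))
               for k in range(dim)]                           # up-projection
        return [h[k] + out[k] for k in range(dim)]            # residual connection
    return adapter

def fuse(h, adapters):
    """Fusion over a pool of adapters: score each adapter's output against the
    hidden state, softmax the scores, and return the weighted sum.
    (Simplified: real AdapterFusion uses learned Q/K/V projections here.)"""
    outputs = [a(h) for a in adapters]
    scores = [sum(h[k] * o[k] for k in range(len(h))) / math.sqrt(len(h))
              for o in outputs]
    m = max(scores)                          # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]      # attention weights over adapters
    return [sum(w * o[k] for w, o in zip(weights, outputs))
            for k in range(len(h))]

# Three hypothetical pretrained "language adapters" fused for one hidden state.
pool = [make_adapter() for _ in range(3)]
hidden = [random.gauss(0, 1) for _ in range(DIM)]
fused = fuse(hidden, pool)
print(len(fused))  # fused vector keeps the model's hidden dimension
```

Because fusion only mixes adapter outputs, the fused representation stays in the model's hidden space, so it can be dropped into a frozen mPLM layer exactly where a single language adapter would sit.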
Anthology ID:
2022.aacl-short.8
Volume:
Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)
Month:
November
Year:
2022
Address:
Online only
Editors:
Yulan He, Heng Ji, Sujian Li, Yang Liu, Chia-Hui Chang
Venues:
AACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
57–64
URL:
https://aclanthology.org/2022.aacl-short.8
Cite (ACL):
Jaeseong Lee, Seung-won Hwang, and Taesup Kim. 2022. FAD-X: Fusing Adapters for Cross-lingual Transfer to Low-Resource Languages. In Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pages 57–64, Online only. Association for Computational Linguistics.
Cite (Informal):
FAD-X: Fusing Adapters for Cross-lingual Transfer to Low-Resource Languages (Lee et al., AACL-IJCNLP 2022)
PDF:
https://preview.aclanthology.org/naacl24-info/2022.aacl-short.8.pdf