KDA: Knowledge Distillation Adapter for Cross-Lingual Transfer

Ta-Bao Nguyen, Nguyen-Phuong Phan, Tung Le, Huy Tien Nguyen


Abstract
State-of-the-art cross-lingual transfer often relies on massive multilingual models, but their prohibitive size and computational cost limit their practicality for low-resource languages. An alternative is to adapt powerful, task-specialized monolingual models, but this presents challenges in bridging the vocabulary and structural gaps between languages. To address this, we propose KDA, a Knowledge Distillation Adapter framework that efficiently adapts a fine-tuned, high-resource monolingual model to a low-resource target language. KDA utilizes knowledge distillation to transfer the source model’s task-solving capabilities to the target language in a parameter-efficient manner. In addition, we introduce a novel adapter architecture that integrates source-language token embeddings while learning new positional embeddings, directly mitigating cross-lingual representational mismatches. Our empirical results on zero-shot transfer for Vietnamese Sentiment Analysis demonstrate that KDA significantly outperforms existing methods, offering a new, effective, and computationally efficient pathway for cross-lingual transfer.
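To make the abstract's two ideas concrete, here is a minimal, hypothetical PyTorch sketch of (1) a soft-target knowledge distillation loss from a fine-tuned source-language teacher and (2) an adapter that reuses the source model's token embeddings while learning new positional embeddings for the target language. All module and variable names are illustrative assumptions; this is not the paper's actual KDA implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EmbeddingAdapter(nn.Module):
    """Hypothetical adapter: frozen source token embeddings + learned
    target-language positional embeddings + a small bottleneck adapter."""

    def __init__(self, source_token_embeddings: nn.Embedding, max_len: int):
        super().__init__()
        hidden = source_token_embeddings.embedding_dim
        # Reuse (and freeze) the source model's token embeddings.
        self.token_emb = source_token_embeddings
        self.token_emb.weight.requires_grad = False
        # Learn fresh positional embeddings for the target language.
        self.pos_emb = nn.Embedding(max_len, hidden)
        # Parameter-efficient bottleneck, trained on the target language.
        self.adapter = nn.Sequential(
            nn.Linear(hidden, hidden // 4),
            nn.GELU(),
            nn.Linear(hidden // 4, hidden),
        )

    def forward(self, input_ids: torch.Tensor) -> torch.Tensor:
        positions = torch.arange(input_ids.size(1), device=input_ids.device)
        h = self.token_emb(input_ids) + self.pos_emb(positions)
        return h + self.adapter(h)  # residual adapter connection

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      T: float = 2.0) -> torch.Tensor:
    """Standard soft-target KD loss: KL divergence between
    temperature-scaled teacher and student distributions."""
    return F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
```

Under this reading of the abstract, only the positional embeddings and adapter parameters would be updated during transfer, trained against the distillation loss (optionally combined with the downstream task loss), which is what keeps the approach parameter-efficient.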
Anthology ID:
2025.inlg-main.8
Volume:
Proceedings of the 18th International Natural Language Generation Conference
Month:
October
Year:
2025
Address:
Hanoi, Vietnam
Editors:
Lucie Flek, Shashi Narayan, Lê Hồng Phương, Jiahuan Pei
Venue:
INLG
SIG:
SIGGEN
Publisher:
Association for Computational Linguistics
Pages:
122–133
URL:
https://preview.aclanthology.org/author-page-lei-gao-usc/2025.inlg-main.8/
Cite (ACL):
Ta-Bao Nguyen, Nguyen-Phuong Phan, Tung Le, and Huy Tien Nguyen. 2025. KDA: Knowledge Distillation Adapter for Cross-Lingual Transfer. In Proceedings of the 18th International Natural Language Generation Conference, pages 122–133, Hanoi, Vietnam. Association for Computational Linguistics.
Cite (Informal):
KDA: Knowledge Distillation Adapter for Cross-Lingual Transfer (Nguyen et al., INLG 2025)
PDF:
https://preview.aclanthology.org/author-page-lei-gao-usc/2025.inlg-main.8.pdf