Abstract
Adapters have emerged as a parameter-efficient framework for cross-lingual transfer, inserting lightweight language-specific modules (language adapters) and task-specific modules (task adapters) into pretrained multilingual Transformer models. Zero-shot transfer is enabled by pairing the language adapter of the target language with an appropriate task adapter trained on a source language. When the target languages are known a priori, we explore how zero-shot transfer can be further improved within the adapter framework by utilizing unlabeled target-language text during task-specific finetuning. We construct language-specific subspaces using standard linear algebra constructs and selectively project source-language representations into the target-language subspace during task-specific finetuning, using two projection schemes. Our experiments on three cross-lingual tasks, Named Entity Recognition (NER), Question Answering (QA), and Natural Language Inference (NLI), yield consistent benefits over adapter baselines across a wide variety of target languages, with relative improvements of up to 11% on NER, 2% on QA, and 5% on NLI.
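The abstract only says the subspaces are built "using standard linear algebra constructs". Below is a minimal sketch of one plausible instantiation, assuming the target-language subspace is the span of the top-k right singular vectors (truncated SVD) of unlabeled target-language hidden states, and that projection is a standard orthogonal projection onto that span. The function names, the rank k, and the use of PyTorch are illustrative assumptions, not the paper's actual implementation.

```python
import torch

def build_subspace(target_hidden: torch.Tensor, k: int = 64):
    """Build a rank-k target-language subspace from unlabeled
    target-language hidden states (n_tokens x d) via truncated SVD.
    NOTE: 'k' is a hypothetical hyperparameter, not from the paper."""
    # Center the representations so the SVD captures directions of
    # variation rather than the mean offset.
    mean = target_hidden.mean(dim=0, keepdim=True)
    _, _, Vt = torch.linalg.svd(target_hidden - mean, full_matrices=False)
    basis = Vt[:k]  # (k, d) orthonormal rows spanning the subspace
    return mean, basis

def project(source_hidden: torch.Tensor, mean: torch.Tensor,
            basis: torch.Tensor) -> torch.Tensor:
    """Orthogonally project source-language representations into the
    target-language subspace: x -> mean + (x - mean) B^T B."""
    centered = source_hidden - mean
    return mean + centered @ basis.T @ basis
```

The paper's two schemes for deciding *when* to apply this projection during task-specific finetuning are not reconstructed here.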
- Anthology ID: 2023.acl-short.39
- Volume: Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
- Month: July
- Year: 2023
- Address: Toronto, Canada
- Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
- Venue: ACL
- Publisher: Association for Computational Linguistics
- Pages: 449–457
- URL: https://aclanthology.org/2023.acl-short.39
- DOI: 10.18653/v1/2023.acl-short.39
- Cite (ACL): Ujan Deb, Ridayesh Parab, and Preethi Jyothi. 2023. Zero-shot Cross-lingual Transfer With Learned Projections Using Unlabeled Target-Language Data. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 449–457, Toronto, Canada. Association for Computational Linguistics.
- Cite (Informal): Zero-shot Cross-lingual Transfer With Learned Projections Using Unlabeled Target-Language Data (Deb et al., ACL 2023)
- PDF: https://preview.aclanthology.org/proper-vol2-ingestion/2023.acl-short.39.pdf