Cross-Lingual Retrieval Augmented Prompt for Low-Resource Languages

Ercong Nie, Sheng Liang, Helmut Schmid, Hinrich Schütze

Abstract
Multilingual Pretrained Language Models (MPLMs) perform strongly in cross-lingual transfer. We propose Prompts Augmented by Retrieval Crosslingually (PARC), which improves zero-shot performance on low-resource languages (LRLs) by augmenting the context with prompts built from semantically similar sentences retrieved from a high-resource language (HRL). PARC improves zero-shot performance on three downstream tasks (sentiment classification, topic categorization, natural language inference) with multilingual parallel test sets across 10 LRLs covering 6 language families, in both the unlabeled (+5.1%) and labeled (+16.3%) settings. PARC also outperforms finetuning by 3.7%. We find a significant positive correlation between cross-lingual transfer performance on the one hand, and both the similarity between high- and low-resource languages and the amount of low-resource pretraining data on the other. A robustness analysis suggests that PARC has the potential to achieve even stronger performance with more powerful MPLMs.
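The abstract describes the PARC pipeline at a high level: for each low-resource input, semantically similar high-resource (e.g., English) sentences are retrieved and prepended to the prompt as cross-lingual demonstrations. The sketch below illustrates that idea for sentiment classification. It is a minimal reconstruction under stated assumptions, not the authors' implementation: the encoder checkpoint, the two-sentence English pool, the verbalizer words ("positive"/"negative"), the cloze template, and the helper build_parc_prompt are all illustrative.

# Minimal sketch of PARC-style cross-lingual retrieval-augmented prompting.
# Assumptions (not from the paper): the encoder checkpoint, the toy English
# pool, the label words, and the prompt template are illustrative; the
# paper's actual retriever, MPLM, and templates may differ.
from sentence_transformers import SentenceTransformer, util

# Multilingual sentence encoder acting as the cross-lingual retriever.
retriever = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

# High-resource-language (English) pool. In the labeled setting each sentence
# carries a gold label; in the unlabeled setting the MPLM would first predict
# labels for the retrieved sentences itself.
hrl_pool = [
    ("The movie was a delight from start to finish.", "positive"),
    ("A tedious, badly acted mess.", "negative"),
]
pool_embs = retriever.encode([s for s, _ in hrl_pool], convert_to_tensor=True)

def build_parc_prompt(lrl_input: str, k: int = 1) -> str:
    """Retrieve the k most similar HRL sentences and prepend them, together
    with their label words, as in-context demonstrations for the MPLM."""
    query_emb = retriever.encode(lrl_input, convert_to_tensor=True)
    scores = util.cos_sim(query_emb, pool_embs)[0]  # cosine similarities
    top_idx = scores.topk(min(k, len(hrl_pool))).indices.tolist()
    demos = "".join(f"{hrl_pool[i][0]} Sentiment: {hrl_pool[i][1]}.\n"
                    for i in top_idx)
    # Cloze-style prompt: a masked MPLM fills in the label word at [MASK].
    return demos + f"{lrl_input} Sentiment: [MASK]."

# Swahili stand-in for a low-resource input ("This movie was very good.").
print(build_parc_prompt("Filamu hii ilikuwa nzuri sana."))

Given the correlation the paper reports between transfer performance and HRL-LRL similarity, the choice of retrieval language (here hard-coded to English) is itself a consequential design decision in such a pipeline.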
Anthology ID:
2023.findings-acl.528
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
8320–8340
URL:
https://aclanthology.org/2023.findings-acl.528
DOI:
10.18653/v1/2023.findings-acl.528
Cite (ACL):
Ercong Nie, Sheng Liang, Helmut Schmid, and Hinrich Schütze. 2023. Cross-Lingual Retrieval Augmented Prompt for Low-Resource Languages. In Findings of the Association for Computational Linguistics: ACL 2023, pages 8320–8340, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Cross-Lingual Retrieval Augmented Prompt for Low-Resource Languages (Nie et al., Findings 2023)
PDF:
https://preview.aclanthology.org/add_acl24_videos/2023.findings-acl.528.pdf
Video:
https://preview.aclanthology.org/add_acl24_videos/2023.findings-acl.528.mp4