Zero-shot Cross-lingual Transfer of Prompt-based Tuning with a Unified Multilingual Prompt

Lianzhe Huang, Shuming Ma, Dongdong Zhang, Furu Wei, Houfeng Wang


Abstract
Prompt-based tuning has proven effective for pretrained language models (PLMs). While most existing work focuses on monolingual prompts, we study multilingual prompts for multilingual PLMs, especially in the zero-shot cross-lingual setting. To alleviate the effort of designing different prompts for multiple languages, we propose a novel model that uses a unified prompt for all languages, called UniPrompt. Unlike discrete prompts and soft prompts, the unified prompt is model-based and language-agnostic. Specifically, the unified prompt is initialized by a multilingual PLM to produce a language-independent representation, after which it is fused with the text input. During inference, the prompts can be pre-computed so that no extra computation cost is incurred. To complement the unified prompt, we propose a new initialization method for the target label word that further improves the model's transferability across languages. Extensive experiments show that our proposed methods significantly outperform strong baselines across different languages. We release data and code to facilitate future research.
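The abstract's key efficiency claim is that the unified prompt can be encoded once and reused for every input at inference time. The toy sketch below illustrates that idea only in outline; `encode_prompt`, `fuse`, and `classify` are hypothetical names for illustration (not the paper's API), and the toy vectors stand in for a real multilingual PLM encoder.

```python
# Hedged sketch: precompute a language-agnostic prompt representation once,
# then fuse it with each text input at inference time.
# All function names here are illustrative, not from the paper.

def encode_prompt(prompt_tokens):
    # Stand-in for running the prompt through a multilingual PLM encoder;
    # here each token is mapped to a toy 1-dimensional vector.
    return [[float(len(t))] for t in prompt_tokens]

def fuse(prompt_repr, input_repr):
    # Simplest possible fusion: prepend the prompt representation
    # to the input representation.
    return prompt_repr + input_repr

# Precomputed once -- no extra per-inference encoding cost for the prompt.
prompt_repr = encode_prompt(["<p1>", "<p2>"])

def classify(input_tokens):
    input_repr = [[float(len(t))] for t in input_tokens]
    fused = fuse(prompt_repr, input_repr)
    # Toy "prediction": just the fused sequence length.
    return len(fused)

print(classify(["hello", "world"]))  # → 4
```

The point of the sketch is the control flow: the prompt encoding happens outside the per-example path, so serving cost matches a prompt-free model.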
Anthology ID:
2022.emnlp-main.790
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
11488–11497
URL:
https://aclanthology.org/2022.emnlp-main.790
DOI:
10.18653/v1/2022.emnlp-main.790
Cite (ACL):
Lianzhe Huang, Shuming Ma, Dongdong Zhang, Furu Wei, and Houfeng Wang. 2022. Zero-shot Cross-lingual Transfer of Prompt-based Tuning with a Unified Multilingual Prompt. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 11488–11497, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Zero-shot Cross-lingual Transfer of Prompt-based Tuning with a Unified Multilingual Prompt (Huang et al., EMNLP 2022)
PDF:
https://preview.aclanthology.org/ingest-acl-2023-videos/2022.emnlp-main.790.pdf