Polyglot Prompt: Multilingual Multitask Prompt Training

Jinlan Fu, See-Kiong Ng, Pengfei Liu


Abstract
This paper aims at a potential architectural improvement for multilingual learning and asks: Can different tasks from different languages be modeled in a monolithic framework, i.e., without any task- or language-specific module? Achieving this could open new doors for future multilingual research, including allowing systems trained on low-resource languages to be further assisted by other languages as well as by other tasks. We approach this goal by developing a learning framework named Polyglot Prompting, which exploits prompting methods to learn a unified semantic space for different languages and tasks via multilingual prompt engineering. We performed a comprehensive evaluation of 6 tasks, namely topic classification, sentiment classification, named entity recognition, question answering, natural language inference, and summarization, covering 24 datasets and 49 languages. The experimental results demonstrate the efficacy of multilingual multitask prompt-based learning and lead to inspiring observations. We also present an interpretable multilingual evaluation methodology and show how the proposed framework, multilingual multitask prompt training, works. We release all datasets prompted in the best setting, together with the code.
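To make the core idea concrete, the sketch below shows how prompt engineering casts heterogeneous tasks into a single text-to-text format so that one model can serve all tasks and languages. The template wording, task names, and helper function are illustrative assumptions, not the paper's actual prompts.

```python
# Hypothetical sketch of multilingual multitask prompting: every task is
# rewritten as a fill-in-the-template text input, so a single text-to-text
# model handles all of them without task- or language-specific modules.
# Template wording and task keys are assumptions for illustration only.

def fill_prompt(template: str, **fields: str) -> str:
    """Instantiate a prompt template with task-specific fields."""
    return template.format(**fields)

# One template per task; inputs in any language slot into the same template.
TEMPLATES = {
    "sentiment": "Review: {text} Is this review positive or negative?",
    "nli": ("Premise: {premise} Hypothesis: {hypothesis} "
            "Entailment, neutral, or contradiction?"),
    "qa": "Question: {question} Context: {context} Answer:",
}

# A German review passed through the English-worded sentiment template;
# the model then generates the label as plain text (e.g. "positive").
prompt = fill_prompt(TEMPLATES["sentiment"], text="Das Essen war ausgezeichnet.")
```

Because every task shares this single input/output interface, training examples from different tasks and languages can simply be mixed in one batch.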
Anthology ID: 2022.emnlp-main.674
Volume: Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month: December
Year: 2022
Address: Abu Dhabi, United Arab Emirates
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 9919–9935
URL: https://aclanthology.org/2022.emnlp-main.674
Cite (ACL): Jinlan Fu, See-Kiong Ng, and Pengfei Liu. 2022. Polyglot Prompt: Multilingual Multitask Prompt Training. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 9919–9935, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal): Polyglot Prompt: Multilingual Multitask Prompt Training (Fu et al., EMNLP 2022)
PDF: https://preview.aclanthology.org/author-url/2022.emnlp-main.674.pdf