In-context Mixing (ICM): Code-mixed Prompts for Multilingual LLMs

Bhavani Shankar, Preethi Jyothi, Pushpak Bhattacharyya


Abstract
We introduce a simple and effective prompting technique, in-context mixing (ICM), for in-context learning (ICL) with multilingual large language models (MLLMs). With ICM, we modify the few-shot examples within ICL prompts to be intra-sententially code-mixed by randomly swapping content words in the target language with their English translations. We observe that ICM prompts yield superior performance on NLP tasks such as disfluency correction, grammar error correction, and text simplification that demand a close correspondence between the input and output sequences. Significant improvements are observed mainly for low-resource languages that are under-represented during the pretraining and finetuning of MLLMs. We present an extensive set of experiments to analyze when ICM is effective and which design choices contribute to its effectiveness. ICM works consistently and significantly better than other prompting techniques across models of varying capacity, such as mT0-XXL, BloomZ, and GPT-4.
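The abstract describes ICM as randomly swapping content words in the target-language demonstrations with their English translations before assembling the ICL prompt. Below is a minimal sketch of how such a code-mixed few-shot prompt could be built, assuming a word-level bilingual lexicon is available; the lexicon, the code_mix helper, and the disfluency-correction example are illustrative assumptions, not taken from the paper.

```python
import random

# Illustrative word-level lexicon (target-language content word -> English translation).
# In practice this could come from a bilingual dictionary or a word aligner.
HI_EN_LEXICON = {
    "कल": "yesterday",
    "बाज़ार": "market",
    "गया": "went",
}

def code_mix(sentence: str, mix_ratio: float = 0.5, seed: int = 0) -> str:
    """Randomly replace covered content words with their English translations."""
    rng = random.Random(seed)
    return " ".join(
        HI_EN_LEXICON[tok] if tok in HI_EN_LEXICON and rng.random() < mix_ratio else tok
        for tok in sentence.split()
    )

# Assemble an ICM-style prompt for disfluency correction: the demonstration
# inputs are code-mixed, while the test input stays in the target language.
# (Whether the demonstration outputs are also mixed is a design choice not covered here.)
demos = [
    ("मैं मैं कल बाज़ार गया", "मैं कल बाज़ार गया"),  # disfluent input, fluent output
]
test_input = "वो वो किताब अच्छी है"

lines = ["Remove disfluencies from the sentence."]
for src, tgt in demos:
    lines.append(f"Input: {code_mix(src)}")
    lines.append(f"Output: {tgt}")
lines.append(f"Input: {test_input}")
lines.append("Output:")
print("\n".join(lines))
```

With the fixed seed above, roughly half of the lexicon-covered content words in each demonstration input are replaced, which is the kind of intra-sentential mixing the abstract refers to.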
Anthology ID: 2024.acl-long.228
Volume: Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: August
Year: 2024
Address: Bangkok, Thailand
Editors: Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 4162–4176
URL: https://aclanthology.org/2024.acl-long.228
DOI: 10.18653/v1/2024.acl-long.228
Cite (ACL): Bhavani Shankar, Preethi Jyothi, and Pushpak Bhattacharyya. 2024. In-context Mixing (ICM): Code-mixed Prompts for Multilingual LLMs. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 4162–4176, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal): In-context Mixing (ICM): Code-mixed Prompts for Multilingual LLMs (Shankar et al., ACL 2024)
PDF: https://preview.aclanthology.org/add_acl24_videos/2024.acl-long.228.pdf
Video: https://preview.aclanthology.org/add_acl24_videos/2024.acl-long.228.mp4