Multilingual Prompting for Improving LLM Generation Diversity

Qihan Wang, Shidong Pan, Tal Linzen, Emily Black


Abstract
Large Language Models (LLMs) are known to lack cultural representation and overall diversity in their generations, from expressing opinions to answering factual questions. To mitigate this problem, we propose multilingual prompting: a prompting method that generates several variations of a base prompt with added cultural and linguistic cues from several cultures, generates responses, and then combines the results. Building on evidence that LLMs have language-specific knowledge, multilingual prompting seeks to increase diversity by activating a broader range of cultural knowledge embedded in model training data. Through experiments across multiple models (GPT-4o, GPT-4o-mini, LLaMA 70B, and LLaMA 8B), we show that multilingual prompting consistently outperforms existing diversity-enhancing techniques such as high-temperature sampling, step-by-step recall, and persona prompting. Further analyses show that the benefits of multilingual prompting vary between high- and low-resource languages and across model sizes, and that aligning the prompting language with cultural cues reduces hallucination about culturally specific information.
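The pipeline the abstract describes (localize a base prompt into several languages with cultural cues, query the model with each variant, then pool the answers) can be sketched as follows. This is an illustrative outline, not the authors' implementation: `translate_fn` and `generate_fn` are hypothetical stand-ins for a real translation step and an LLM API call, and the combination step here is simple order-preserving deduplication.

```python
def multilingual_prompt(base_prompt, languages, translate_fn, generate_fn):
    """Sketch of multilingual prompting (illustrative only).

    For each target language, produce a localized variant of the base
    prompt (adding linguistic/cultural cues), query the model with it,
    and combine the per-language responses into one diverse answer set.
    """
    responses = []
    for lang in languages:
        # Hypothetical translation/localization step.
        localized = translate_fn(base_prompt, lang)
        # Hypothetical LLM call for this language variant.
        responses.append(generate_fn(localized, lang))

    # Combine: deduplicate responses while preserving order, so each
    # culturally distinct answer appears once in the pooled output.
    seen, combined = set(), []
    for r in responses:
        if r not in seen:
            seen.add(r)
            combined.append(r)
    return combined


if __name__ == "__main__":
    # Stubbed translation and generation, just to show the data flow.
    def translate_fn(prompt, lang):
        return f"[{lang}] {prompt}"

    def generate_fn(prompt, lang):
        # Pretend two languages happen to yield the same answer.
        canned = {"en": "dish A", "es": "dish B", "hi": "dish A"}
        return canned[lang]

    pooled = multilingual_prompt(
        "Name a popular breakfast dish.", ["en", "es", "hi"],
        translate_fn, generate_fn,
    )
    print(pooled)
```

In practice the combination step could instead sample across languages or translate all responses back into one language before merging; the sketch only fixes the overall prompt-fan-out structure.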
Anthology ID:
2025.emnlp-main.324
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
6378–6400
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.324/
Cite (ACL):
Qihan Wang, Shidong Pan, Tal Linzen, and Emily Black. 2025. Multilingual Prompting for Improving LLM Generation Diversity. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 6378–6400, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Multilingual Prompting for Improving LLM Generation Diversity (Wang et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.324.pdf
Checklist:
2025.emnlp-main.324.checklist.pdf