From Surveys to Narratives: Rethinking Cultural Value Adaptation in LLMs
Farid Adilazuarda, Chen Cecilia Liu, Iryna Gurevych, Alham Fikri Aji
Abstract
Adapting cultural values in Large Language Models (LLMs) presents significant challenges, particularly due to biases and data limitations. Previous work aligns LLMs with different cultures using survey data, primarily from the World Values Survey (WVS). However, it remains unclear whether this approach effectively captures cultural nuances or produces distinct cultural representations for tasks like offensiveness classification. In this paper, we systematically investigate WVS-based training for cultural value adaptation and find that relying solely on survey data can homogenize cultural norms and interfere with factual knowledge. To address these issues, we propose augmenting WVS with encyclopedic and scenario-based cultural narratives from Wikipedia and NormAd. Our experiments across multiple cultures show that this approach captures more differentiated cultural values and improves downstream classification performance.
- Anthology ID: 2025.emnlp-main.912
- Volume: Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
- Month: November
- Year: 2025
- Address: Suzhou, China
- Editors: Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
- Venue: EMNLP
- Publisher: Association for Computational Linguistics
- Pages: 18063–18090
- URL: https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.912/
- Cite (ACL): Farid Adilazuarda, Chen Cecilia Liu, Iryna Gurevych, and Alham Fikri Aji. 2025. From Surveys to Narratives: Rethinking Cultural Value Adaptation in LLMs. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 18063–18090, Suzhou, China. Association for Computational Linguistics.
- Cite (Informal): From Surveys to Narratives: Rethinking Cultural Value Adaptation in LLMs (Adilazuarda et al., EMNLP 2025)
- PDF: https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.912.pdf