Culturally-Aware Conversations: A Framework & Benchmark for LLMs

Shreya Havaldar, Young Min Cho, Sunny Rai, Lyle Ungar


Abstract
Existing benchmarks that measure cultural adaptation in LLMs are misaligned with the actual challenges these models face when interacting with users from diverse cultural backgrounds. In this work, we introduce the first framework and benchmark designed to evaluate LLMs in realistic, multicultural conversational settings. Grounded in sociocultural theory, our framework formalizes how linguistic style — a key element of cultural communication — is shaped by situational, relational, and cultural context. We construct a benchmark dataset based on this framework, annotated by culturally diverse raters, and propose a new set of desiderata for cross-cultural evaluation in NLP: conversational framing, stylistic sensitivity, and subjective correctness. We evaluate today’s top LLMs on our benchmark and show that these models struggle with cultural adaptation in a conversational setting.
Anthology ID:
2025.hcinlp-1.18
Volume:
Proceedings of the Fourth Workshop on Bridging Human-Computer Interaction and Natural Language Processing (HCI+NLP)
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Su Lin Blodgett, Amanda Cercas Curry, Sunipa Dev, Siyan Li, Michael Madaio, Jack Wang, Sherry Tongshuang Wu, Ziang Xiao, Diyi Yang
Venues:
HCINLP | WS
Publisher:
Association for Computational Linguistics
Pages:
220–229
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.hcinlp-1.18/
Cite (ACL):
Shreya Havaldar, Young Min Cho, Sunny Rai, and Lyle Ungar. 2025. Culturally-Aware Conversations: A Framework & Benchmark for LLMs. In Proceedings of the Fourth Workshop on Bridging Human-Computer Interaction and Natural Language Processing (HCI+NLP), pages 220–229, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Culturally-Aware Conversations: A Framework & Benchmark for LLMs (Havaldar et al., HCINLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.hcinlp-1.18.pdf