Measuring the Symbolic Power of Languages with LLM-based Multilingual Persuasion Simulation

Yin Jou Huang, Fei Cheng


Abstract
Prior studies on the symbolic power of languages have largely relied on surveys or localized experiments, limiting systematic comparison across cultures and domains. In this work, we propose an LLM-based multilingual persuasion simulation framework to quantify the symbolic power of languages through persuasion outcomes. We also introduce a Symbolic Power Index (SPI) that measures how language choice affects persuasion success and efficiency across domains. Experiments show that the LLM-based simulations largely reproduce established sociolinguistic prestige hierarchies tied to institutional authority and global power, especially in domains such as business, finance, education, and technology. These results suggest that LLM-based persuasion simulations offer a scalable, decision-making-driven approach to studying symbolic power in language.
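To make the abstract's setup concrete, here is a minimal illustrative sketch (not the authors' released code) of how an LLM-based persuasion simulation could be scored with a simple index over persuasion success and efficiency. The `ask_llm` stub, the prompt wording, and the index formula are assumptions for illustration only; the paper's actual SPI definition and simulation protocol may differ.

```python
"""Hedged sketch: scoring multilingual persuasion simulations with a
simple symbolic-power-style index. All names and formulas are assumed."""
from dataclasses import dataclass
from statistics import mean
from typing import Callable, List


@dataclass
class PersuasionOutcome:
    language: str   # language used by the persuader agent
    domain: str     # e.g. "finance", "education"
    success: bool   # did the persuadee agent accept the proposal?
    turns: int      # dialogue turns until acceptance (or give-up)


def symbolic_power_index(outcomes: List[PersuasionOutcome],
                         max_turns: int = 10) -> float:
    """Combine success rate and efficiency into one score in [0, 1].

    Assumed formula: success rate, weighted by how quickly the
    successful persuasions concluded (fewer turns -> higher efficiency).
    """
    if not outcomes:
        return 0.0
    successes = [o for o in outcomes if o.success]
    success_rate = len(successes) / len(outcomes)
    if not successes:
        return 0.0
    efficiency = mean(1.0 - (o.turns - 1) / max_turns for o in successes)
    return success_rate * efficiency


def run_simulation(ask_llm: Callable[[str], str],
                   language: str, domain: str,
                   max_turns: int = 10) -> PersuasionOutcome:
    """Run one persuader/persuadee exchange, stopping on acceptance."""
    for turn in range(1, max_turns + 1):
        # Hypothetical prompt: the persuader argues in `language` within
        # `domain`; the persuadee replies ACCEPT or REJECT.
        reply = ask_llm(
            f"[{domain}] Persuader speaks {language}. "
            f"Turn {turn}: does the persuadee ACCEPT or REJECT?"
        )
        if "ACCEPT" in reply.upper():
            return PersuasionOutcome(language, domain, True, turn)
    return PersuasionOutcome(language, domain, False, max_turns)


if __name__ == "__main__":
    # Stubbed LLM that accepts on every third call, purely for demonstration.
    calls = {"n": 0}

    def fake_llm(prompt: str) -> str:
        calls["n"] += 1
        return "ACCEPT" if calls["n"] % 3 == 0 else "REJECT"

    outcomes = [run_simulation(fake_llm, "English", "finance") for _ in range(5)]
    print(f"index(English, finance) = {symbolic_power_index(outcomes):.3f}")
```

In such a setup, comparing the index across languages within a fixed domain would expose the prestige hierarchies the abstract describes; the weighting between success and efficiency shown above is one possible design choice, not the paper's.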
Anthology ID:
2026.latechclfl-1.32
Volume:
Proceedings of the 10th Joint SIGHUM Workshop on Computational Linguistics for Cultural Heritage, Social Sciences, Humanities and Literature 2026
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Diego Alves, Yuri Bizzoni, Stefania Degaetano-Ortlieb, Anna Kazantseva, Janis Pagel, Stan Szpakowicz
Venues:
LaTeCH-CLfL | WS
SIG:
SIGHUM
Publisher:
Association for Computational Linguistics
Pages:
328–338
URL:
https://preview.aclanthology.org/ingest-eacl/2026.latechclfl-1.32/
Cite (ACL):
Yin Jou Huang and Fei Cheng. 2026. Measuring the Symbolic Power of Languages with LLM-based Multilingual Persuasion Simulation. In Proceedings of the 10th Joint SIGHUM Workshop on Computational Linguistics for Cultural Heritage, Social Sciences, Humanities and Literature 2026, pages 328–338, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
Measuring the Symbolic Power of Languages with LLM-based Multilingual Persuasion Simulation (Huang & Cheng, LaTeCH-CLfL 2026)
PDF:
https://preview.aclanthology.org/ingest-eacl/2026.latechclfl-1.32.pdf