Information Asymmetry across Language Varieties: A Case Study on Cantonese-Mandarin and Bavarian-German QA

Renhao Pei, Siyao Peng, Verena Blaschke, Robert Litschko, Barbara Plank


Abstract
Large Language Models (LLMs) are becoming a common way for humans to seek knowledge, yet their coverage and reliability vary widely. For local language varieties in particular, there are large asymmetries, e.g., information in a local Wikipedia edition that is absent from the standard variety's edition. However, little is known about how well LLMs perform under such information asymmetry, especially for closely related languages. We manually construct a novel challenge question-answering (QA) dataset that captures knowledge conveyed on a local Wikipedia page but absent from its higher-resource counterpart, covering Mandarin Chinese vs. Cantonese and German vs. Bavarian. Our experiments show that LLMs fail to answer questions about information found only in local editions of Wikipedia. Providing context from lead sections substantially improves performance, with further gains possible via translation. Our topical and geographic annotations, together with stratified evaluations, reveal the usefulness of local Wikipedia editions as sources of both regional and global information. These findings raise critical questions about the inclusivity and cultural coverage of LLMs.
Anthology ID:
2026.lrec-main.100
Volume:
Proceedings of the Fifteenth Language Resources and Evaluation Conference
Month:
May
Year:
2026
Address:
Palma de Mallorca, Spain
Editors:
Stelios Piperidis, Núria Bel, Henk van den Heuvel, Nancy Ide, Simon Krek, Antonio Toral
Venue:
LREC
SIG:
Publisher:
ELRA Language Resources Association
Note:
Pages:
1280–1302
Language:
URL:
https://preview.aclanthology.org/ingest-lrec/2026.lrec-main.100/
DOI:
Bibkey:
Cite (ACL):
Renhao Pei, Siyao Peng, Verena Blaschke, Robert Litschko, and Barbara Plank. 2026. Information Asymmetry across Language Varieties: A Case Study on Cantonese-Mandarin and Bavarian-German QA. In Proceedings of the Fifteenth Language Resources and Evaluation Conference, pages 1280–1302, Palma de Mallorca, Spain. ELRA Language Resources Association.
Cite (Informal):
Information Asymmetry across Language Varieties: A Case Study on Cantonese-Mandarin and Bavarian-German QA (Pei et al., LREC 2026)
PDF:
https://preview.aclanthology.org/ingest-lrec/2026.lrec-main.100.pdf