Low-Resource Languages LLM Disinformation is Within Reach: The Case of Walliserdeutsch

Andrei Kucharavy, Sherine Seppey, Cyril Vallez, Dimitri Percia David, Ljiljana Dolamic


Abstract
LLM-augmented online disinformation is of particular concern for low-resource languages, given their limited prior exposure to it. While current LLMs lack fluency in such languages, their multilingual and emergent capabilities can potentially still be leveraged. In this paper, we investigate whether a moderately sophisticated attacker can leverage such capabilities to perform an impersonation attack in Walliserdeutsch, a low-resource (100k speakers) Highest Alemannic Swiss German dialect that is generally unintelligible to speakers of both Standard German and other Swiss German dialects and exhibits considerable within-dialect variability. We show that while standard few-shot prompting of SotA LLMs, even by native Walliserdeutsch speakers, yields easily human-detectable texts, an expert attacker performing parameter-efficient fine-tuning (PEFT) on a small SotA LLM can partially carry out such an impersonation with minimal resources, even if the fine-tuned LLM does not advertise any capabilities in Germanic languages. As Walliserdeutsch presents many features typical of low-resource languages and dialects, our results suggest that LLM-augmented disinformation is within reach for low-resource languages, highlighting the urgency of LLM detectability research in low-resource languages.
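The PEFT setup the abstract refers to is, in general terms, low-rank adapter fine-tuning of a small open-weights causal LM on a modest in-dialect corpus. The sketch below illustrates that general technique with the Hugging Face transformers and peft libraries; the base model identifier, corpus, and hyperparameters are placeholder assumptions for illustration, not the configuration reported in the paper.

```python
# Minimal sketch: LoRA-based parameter-efficient fine-tuning of a small causal LM
# on an in-dialect text corpus. All names and hyperparameters below are
# illustrative assumptions, not the paper's actual setup.
from datasets import Dataset
from peft import LoraConfig, TaskType, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

BASE_MODEL = "base-model-id"   # hypothetical small open-weights LLM
texts = ["..."]                # small placeholder corpus of Walliserdeutsch text

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

# LoRA freezes the base weights and trains only low-rank adapter matrices,
# keeping the trainable-parameter count (and hardware cost) small.
peft_config = LoraConfig(task_type=TaskType.CAUSAL_LM, r=8, lora_alpha=16, lora_dropout=0.05)
model = get_peft_model(model, peft_config)
model.print_trainable_parameters()  # typically well under 1% of the base model

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

train_ds = Dataset.from_dict({"text": texts}).map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="walliserdeutsch-lora",
                           num_train_epochs=3,
                           per_device_train_batch_size=4,
                           learning_rate=2e-4),
    train_dataset=train_ds,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("walliserdeutsch-lora")  # saves only the adapter weights
```

Because only adapter weights are trained and stored, this kind of setup runs on commodity hardware, which is consistent with the "minimal resources" framing in the abstract.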
Anthology ID:
2025.findings-emnlp.1396
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
25613–25625
URL:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.1396/
DOI:
10.18653/v1/2025.findings-emnlp.1396
Cite (ACL):
Andrei Kucharavy, Sherine Seppey, Cyril Vallez, Dimitri Percia David, and Ljiljana Dolamic. 2025. Low-Resource Languages LLM Disinformation is Within Reach: The Case of Walliserdeutsch. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 25613–25625, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Low-Resource Languages LLM Disinformation is Within Reach: The Case of Walliserdeutsch (Kucharavy et al., Findings 2025)
PDF:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.1396.pdf
Checklist:
2025.findings-emnlp.1396.checklist.pdf