CausalAbstain: Enhancing Multilingual LLMs with Causal Reasoning for Trustworthy Abstention

Yuxi Sun, Aoqi Zuo, Wei Gao, Jing Ma


Abstract
Large Language Models (LLMs) often exhibit knowledge disparities across languages. Encouraging LLMs to abstain when faced with knowledge gaps is a promising strategy to reduce hallucinations in multilingual settings. Current abstention strategies for multilingual scenarios primarily rely on generating feedback in various languages using LLMs and performing self-reflection. However, these methods can be adversely impacted by inaccuracies and biases in the generated feedback. To address this, from a causal perspective, we introduce CausalAbstain, a method that helps LLMs determine whether to utilize multiple generated feedback responses and how to identify the most useful ones. Extensive experiments demonstrate that CausalAbstain effectively selects helpful feedback and enhances abstention decisions with interpretability in both native language (Causal-native) and multilingual (Causal-multi) settings, outperforming strong baselines on two benchmark datasets covering encyclopedic and commonsense knowledge QA tasks.
Anthology ID:
2025.findings-acl.723
Volume:
Findings of the Association for Computational Linguistics: ACL 2025
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
14060–14076
URL:
https://preview.aclanthology.org/mtsummit-25-ingestion/2025.findings-acl.723/
DOI:
10.18653/v1/2025.findings-acl.723
Cite (ACL):
Yuxi Sun, Aoqi Zuo, Wei Gao, and Jing Ma. 2025. CausalAbstain: Enhancing Multilingual LLMs with Causal Reasoning for Trustworthy Abstention. In Findings of the Association for Computational Linguistics: ACL 2025, pages 14060–14076, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
CausalAbstain: Enhancing Multilingual LLMs with Causal Reasoning for Trustworthy Abstention (Sun et al., Findings 2025)
PDF:
https://preview.aclanthology.org/mtsummit-25-ingestion/2025.findings-acl.723.pdf