Refining Attention for Explainable and Noise-Robust Fact-Checking with Transformers

Jean-Flavien Bussotti, Paolo Papotti

Abstract
In tasks like question answering and fact-checking, models must discern relevant information from extensive corpora in an “open-book” setting. Conventional transformer-based models excel at classifying input data, but (i) often falter due to sensitivity to noise and (ii) lack explainability regarding their decision process. To address these challenges, we introduce ATTUN, a novel transformer architecture designed to enhance model transparency and resilience to noise by refining the attention mechanisms. Our approach involves a dedicated module that directly modifies attention weights, allowing the model to both improve predictions and identify the most relevant sections of input data. We validate our methodology using fact-checking datasets and show promising results in question answering. Experiments demonstrate improvements of up to 51% in F1 score for detecting relevant context, and gains of up to 18% in task accuracy when integrating ATTUN into a model.
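The paper's architecture is not reproduced here, but the core idea the abstract describes, a module that rescales attention weights and exposes them as token-level relevance scores, can be sketched in a few lines. The PyTorch snippet below is a minimal illustration under our own assumptions: the class name `AttentionRefiner`, the gating MLP, and the renormalization step are hypothetical and are not ATTUN's actual design.

```python
import torch
import torch.nn as nn


class AttentionRefiner(nn.Module):
    """Hypothetical attention-refinement module (illustrative sketch,
    not the authors' code). Learns a per-token gate from the hidden
    states, rescales the raw attention weights with it, and
    renormalizes. The gate doubles as a relevance score over tokens."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.gate = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, attn_weights: torch.Tensor, hidden: torch.Tensor):
        # attn_weights: (batch, heads, query_len, key_len), already softmaxed
        # hidden:       (batch, key_len, hidden_dim) token representations
        gates = torch.sigmoid(self.gate(hidden)).squeeze(-1)  # (batch, key_len)
        g = gates[:, None, None, :]                 # broadcast over heads/queries
        refined = attn_weights * g
        # Renormalize so each query's weights remain a valid distribution.
        refined = refined / refined.sum(dim=-1, keepdim=True).clamp_min(1e-9)
        return refined, gates  # gates serve as token-level relevance scores


# Toy usage: refine uniform attention over 6 key tokens.
if __name__ == "__main__":
    refiner = AttentionRefiner(hidden_dim=16)
    attn = torch.full((1, 2, 4, 6), 1 / 6)  # uniform attention
    hidden = torch.randn(1, 6, 16)
    refined, relevance = refiner(attn, hidden)
    print(refined.shape, relevance.shape)   # (1, 2, 4, 6) and (1, 6)
```

Renormalizing after gating keeps the refined weights a proper distribution, and returning the gates gives a direct explanation signal in the spirit of the transparency goal the abstract describes.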
Anthology ID:
2025.emnlp-main.1295
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
25487–25499
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1295/
Cite (ACL):
Jean-Flavien Bussotti and Paolo Papotti. 2025. Refining Attention for Explainable and Noise-Robust Fact-Checking with Transformers. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 25487–25499, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Refining Attention for Explainable and Noise-Robust Fact-Checking with Transformers (Bussotti & Papotti, EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1295.pdf
Checklist:
2025.emnlp-main.1295.checklist.pdf