Mitigating Attention Localization in Small Scale: Self-Attention Refinement via One-step Belief Propagation

Nakyung Lee, Yeongoon Kim, Minhae Oh, Suhwan Kim, Jin Woo Koo, Hyewon Jo, Jungwoo Lee

Abstract
The Transformer-based self-attention mechanism serves as the core of modern language models, yet it often suffers from *localization*, where attention collapses onto a limited subset of tokens and fails to capture long-range dependencies. To address this issue, we propose **Self-Attention One-step Belief Propagation (SAOBP)**, a refinement framework that injects *multi-hop* relationships into the attention map through a belief propagation process. To interpret and quantify these interactions, we introduce **Global Token Dependency (GTD)**, a metric that captures the relative contribution of multi-hop connections within the attention graph. Empirical results indicate that SAOBP helps prevent entropy collapse in deeper layers and adaptively maintains GTD at task-appropriate levels, thereby supporting improvements in model performance. Notably, we observe competitive gains in small-scale models, highlighting the framework's potential for improving inference quality in resource-constrained scenarios.
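
This page does not reproduce the paper's equations, but the mechanism the abstract describes, blending an attention map with its propagated multi-hop counterpart in a single step, can be sketched as below. This is a minimal illustration assuming a simple convex mixing rule: the function names, the mixing weight `alpha`, and the `gtd_proxy` measure are hypothetical stand-ins, not the authors' definitions.

```python
import torch

def saobp_refine(attn: torch.Tensor, alpha: float = 0.3) -> torch.Tensor:
    """Hypothetical one-step belief-propagation refinement of an attention map.

    attn  : row-stochastic attention matrix, shape (..., n, n)
    alpha : assumed mixing weight for the propagated (two-hop) term
    """
    # One propagation step: token i reaches token j through any intermediate k.
    two_hop = attn @ attn
    # Blend direct and multi-hop attention, then renormalize rows so the
    # result remains a valid attention distribution.
    refined = (1.0 - alpha) * attn + alpha * two_hop
    return refined / refined.sum(dim=-1, keepdim=True)

def gtd_proxy(attn: torch.Tensor, refined: torch.Tensor) -> torch.Tensor:
    """Assumed proxy for Global Token Dependency: the relative attention
    mass redistributed by the multi-hop refinement step."""
    return (refined - attn).abs().sum() / refined.sum()

# Usage: refine a random (batch, heads, n, n) attention tensor.
attn = torch.softmax(torch.randn(2, 8, 16, 16), dim=-1)
refined = saobp_refine(attn)
print(f"GTD proxy: {gtd_proxy(attn, refined).item():.3f}")
```

Under this sketch, `alpha = 0` recovers the original attention map, while larger values shift mass toward two-hop paths, which is one plausible way to counteract the localization the abstract describes.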
Anthology ID:
2025.findings-emnlp.578
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
10897–10912
URL:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.578/
DOI:
10.18653/v1/2025.findings-emnlp.578
Cite (ACL):
Nakyung Lee, Yeongoon Kim, Minhae Oh, Suhwan Kim, Jin Woo Koo, Hyewon Jo, and Jungwoo Lee. 2025. Mitigating Attention Localization in Small Scale: Self-Attention Refinement via One-step Belief Propagation. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 10897–10912, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Mitigating Attention Localization in Small Scale: Self-Attention Refinement via One-step Belief Propagation (Lee et al., Findings 2025)
PDF:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.578.pdf
Checklist:
2025.findings-emnlp.578.checklist.pdf