Attention with Dependency Parsing Augmentation for Fine-Grained Attribution

Qiang Ding, Lvzhou Luo, Yixuan Cao, Ping Luo


Abstract
To help humans efficiently validate RAG-generated content, a fine-grained attribution mechanism that provides supporting evidence from retrieved documents for every answer span is essential. Existing fine-grained attribution methods rely on model-internal similarity metrics between responses and documents, such as saliency scores and hidden-state similarity, but these metrics suffer from either high computational complexity or coarse-grained representations. Moreover, prior methods share a common limitation: because they rely on decoder-only Transformers, they cannot incorporate contextual information that appears after the target span. To address these problems, we propose two techniques applicable to any model-internals-based method. First, we aggregate token-wise evidence through set union operations, preserving the granularity of token-level representations. Second, we enhance the attributor by integrating dependency parsing to enrich the semantic completeness of target spans. For a practical implementation, our approach uses attention weights as the similarity metric. Experimental results demonstrate that the proposed method consistently outperforms all prior work.
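
The abstract describes the two techniques only at a high level. The following minimal Python sketch is not the authors' implementation; the precomputed attention matrix, the top-k evidence selection, and the use of spaCy's dependency parser are all assumptions made for illustration. It shows how set-union aggregation of token-wise evidence and dependency-based span expansion might compose:

# Minimal sketch of the two techniques named in the abstract (assumptions:
# attention is given as a precomputed response-tokens x document-tokens
# matrix, and spaCy supplies the dependency parse of the response).
import numpy as np
import spacy

nlp = spacy.load("en_core_web_sm")  # assumes this model is installed

def token_evidence(attn_row, top_k=3):
    """Top-k document token indices most attended to by one response token."""
    return set(np.argsort(attn_row)[-top_k:].tolist())

def span_evidence(attn, span_token_ids):
    """Technique 1: aggregate token-wise evidence with a set union,
    preserving the granularity of the per-token attributions."""
    evidence = set()
    for i in span_token_ids:
        evidence |= token_evidence(attn[i])
    return evidence

def expand_span(response_text, span_token_ids):
    """Technique 2: use the dependency parse to enrich the semantic
    completeness of the target span (here: add each token's subtree)."""
    doc = nlp(response_text)
    expanded = set(span_token_ids)
    for i in span_token_ids:
        expanded.update(t.i for t in doc[i].subtree)
    return sorted(expanded)

# Toy usage: a random matrix stands in for real decoder attention, and spaCy
# token indices are assumed to align with attention rows (a real subword
# tokenizer would need an alignment step).
rng = np.random.default_rng(0)
attn = rng.random((6, 20))                                  # 6 response tokens, 20 doc tokens
span = expand_span("The model cites two documents .", [1])  # expand the span "model"
print(span_evidence(attn, span))                            # union of top-k evidence sets

In the paper itself, evidence would additionally be mapped to document sentences rather than left at the token level; the sketch stops at the union step for brevity.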
Anthology ID:
2025.findings-acl.21
Volume:
Findings of the Association for Computational Linguistics: ACL 2025
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venues:
Findings | WS
Publisher:
Association for Computational Linguistics
Pages:
372–387
URL:
https://preview.aclanthology.org/ingestion-acl-25/2025.findings-acl.21/
Cite (ACL):
Qiang Ding, Lvzhou Luo, Yixuan Cao, and Ping Luo. 2025. Attention with Dependency Parsing Augmentation for Fine-Grained Attribution. In Findings of the Association for Computational Linguistics: ACL 2025, pages 372–387, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Attention with Dependency Parsing Augmentation for Fine-Grained Attribution (Ding et al., Findings 2025)
PDF:
https://preview.aclanthology.org/ingestion-acl-25/2025.findings-acl.21.pdf