Generalized Attention Flow: Feature Attribution for Transformer Models via Maximum Flow

Behrooz Azarkhalili, Maxwell W. Libbrecht


Abstract
This paper introduces Generalized Attention Flow (GAF), a novel feature attribution method for Transformer-based models that addresses the limitations of current approaches. By extending Attention Flow and replacing attention weights with a generalized Information Tensor, GAF combines attention weights, their gradients, the maximum flow problem, and the barrier method to improve the quality of feature attributions. The proposed method satisfies key theoretical properties and mitigates the shortcomings of prior techniques that rely solely on simple aggregation of attention weights. Our comprehensive benchmarking on sequence classification tasks demonstrates that a specific variant of GAF consistently outperforms state-of-the-art feature attribution methods in most evaluation settings, providing a more reliable interpretation of Transformer model outputs.
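
The paper's full construction (the generalized Information Tensor and the barrier-method formulation of the flow problem) is given in the PDF linked below. As a rough illustration of the underlying idea only, the following minimal Python sketch scores each input token by solving a max-flow problem over the layered attention graph. The gradient-weighted capacities |A ⊙ ∇A| are an assumed, simplified stand-in for GAF's Information Tensor, and the function name, shapes, and graph layout here are hypothetical, not the authors' exact method.

```python
# Minimal sketch: attention-flow-style attribution via max flow (networkx).
# NOT the exact GAF algorithm: the capacity |A * dA| below is a simplified,
# assumed stand-in for the paper's generalized Information Tensor.
import numpy as np
import networkx as nx

def flow_attributions(attn, grads, target_pos=0):
    """attn, grads: (num_layers, seq_len, seq_len) arrays of attention
    weights and their gradients w.r.t. the model output.
    Returns one attribution score per input token position."""
    num_layers, n, _ = attn.shape
    # Gradient-weighted edge capacities, one matrix per layer (illustrative).
    info = np.abs(attn * grads)
    # Layered graph: node (l, j) is token position j entering layer l;
    # layer l carries capacity info[l, i, j] from key j to query i.
    G = nx.DiGraph()
    for l in range(num_layers):
        for i in range(n):
            for j in range(n):
                G.add_edge((l, j), (l + 1, i), capacity=float(info[l, i, j]))
    # Attribution of token j = max flow it can push from the input layer
    # to the target position at the output layer.
    scores = np.zeros(n)
    for j in range(n):
        value, _ = nx.maximum_flow(G, (0, j), (num_layers, target_pos))
        scores[j] = value
    return scores
```

In practice, attn and grads could be collected from a Hugging Face Transformers model by passing output_attentions=True and backpropagating the target class logit; calling flow_attributions(attn, grads) then yields one nonnegative relevance score per input token.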
Anthology ID: 2025.acl-long.980
Volume: Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: July
Year: 2025
Address: Vienna, Austria
Editors: Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 19954–19974
URL: https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.980/
Cite (ACL): Behrooz Azarkhalili and Maxwell W. Libbrecht. 2025. Generalized Attention Flow: Feature Attribution for Transformer Models via Maximum Flow. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 19954–19974, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal): Generalized Attention Flow: Feature Attribution for Transformer Models via Maximum Flow (Azarkhalili & Libbrecht, ACL 2025)
PDF: https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.980.pdf