TextGenSHAP: Scalable Post-Hoc Explanations in Text Generation with Long Documents

James Enouen, Hootan Nakhost, Sayna Ebrahimi, Sercan Arik, Yan Liu, Tomas Pfister


Abstract
Large language models (LLMs) have attracted great interest in many real-world applications; however, their “black-box” nature necessitates scalable and faithful explanations. Shapley values have matured as an explainability method for deep learning, but extending them to LLMs is difficult due to long input contexts and autoregressive output generation. We introduce TextGenSHAP, an efficient post-hoc explanation method incorporating LLM-specific techniques, which leads to significant runtime improvements: token-level explanations in minutes rather than hours, and document-level explanations within seconds. We demonstrate how such explanations can improve the end-to-end performance of retrieval-augmented generation by localizing important words within long documents and reranking passages collected by retrieval systems. On various open-domain question answering benchmarks, we show that TextGenSHAP significantly improves retrieval recall and prediction accuracy.
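To make the core idea concrete, below is a minimal illustrative sketch (not the paper's implementation, which adds LLM-specific speedups) of permutation-sampling Monte Carlo Shapley estimation over retrieved passages. The utility function `score` is a hypothetical stand-in for, e.g., the model's log-probability of its answer given only the included passages.

# Illustrative sketch only: Monte Carlo (permutation-sampling) Shapley values
# over retrieved passages. `score` maps a subset (list) of passages to a real
# number; in practice it would query an LLM rather than count word overlap.
import random

def shapley_passage_scores(passages, score, n_permutations=200, seed=0):
    """Estimate each passage's Shapley value under the utility `score`."""
    rng = random.Random(seed)
    n = len(passages)
    values = [0.0] * n
    for _ in range(n_permutations):
        order = list(range(n))
        rng.shuffle(order)          # random arrival order of passages
        included = []
        prev = score(included)      # utility of the empty coalition
        for i in order:
            included.append(passages[i])
            cur = score(included)
            values[i] += cur - prev # marginal contribution of passage i
            prev = cur
    return [v / n_permutations for v in values]

# Toy usage: the utility counts overlap with query terms; passages are then
# reranked by their estimated Shapley value, as in Shapley-based reranking.
query_terms = {"shapley", "llm"}
passages = ["shapley values for llm explanations", "cooking pasta", "llm retrieval"]
score = lambda subset: sum(len(query_terms & set(p.split())) for p in subset)
ranked = sorted(zip(shapley_passage_scores(passages, score), passages), reverse=True)
print(ranked)

This naive estimator needs many model calls per permutation, which is exactly the cost that TextGenSHAP's LLM-specific techniques are designed to reduce for long documents and autoregressive outputs.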
Anthology ID:
2024.findings-acl.832
Volume:
Findings of the Association for Computational Linguistics: ACL 2024
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
13984–14011
URL:
https://aclanthology.org/2024.findings-acl.832
DOI:
10.18653/v1/2024.findings-acl.832
Cite (ACL):
James Enouen, Hootan Nakhost, Sayna Ebrahimi, Sercan Arik, Yan Liu, and Tomas Pfister. 2024. TextGenSHAP: Scalable Post-Hoc Explanations in Text Generation with Long Documents. In Findings of the Association for Computational Linguistics: ACL 2024, pages 13984–14011, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
TextGenSHAP: Scalable Post-Hoc Explanations in Text Generation with Long Documents (Enouen et al., Findings 2024)
PDF:
https://preview.aclanthology.org/add_acl24_videos/2024.findings-acl.832.pdf
Video:
https://preview.aclanthology.org/add_acl24_videos/2024.findings-acl.832.mp4