Abstract
The black-box nature of deep learning models in NLP hinders their widespread application. The research focus has shifted to Hierarchical Attribution (HA) for its ability to model feature interactions. Recent works model non-contiguous combinations with a time-costly greedy search in Euclidean spaces, neglecting the underlying linguistic information in feature representations. In this work, we introduce a novel method, namely Poincare Explanation (PE), for modeling feature interactions with hyperbolic spaces in a time-efficient manner. Specifically, we treat building text hierarchies as finding spanning trees in hyperbolic spaces. First, we project the embeddings into hyperbolic spaces to elicit the inherent semantic and syntactic hierarchical structures. Then we propose a simple yet effective strategy to calculate Shapley scores. Finally, we build the hierarchy by proving that the construction process in the projected space can be viewed as building a minimum spanning tree, and we introduce a time-efficient construction algorithm. Experimental results demonstrate the effectiveness of our approach. Our code is available at https://anonymous.4open.science/r/PE-747B.
- Anthology ID:
- 2024.findings-emnlp.462
- Volume:
- Findings of the Association for Computational Linguistics: EMNLP 2024
- Month:
- November
- Year:
- 2024
- Address:
- Miami, Florida, USA
- Editors:
- Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
- Venue:
- Findings
- Publisher:
- Association for Computational Linguistics
- Pages:
- 7876–7888
- URL:
- https://aclanthology.org/2024.findings-emnlp.462
- DOI:
- 10.18653/v1/2024.findings-emnlp.462
- Cite (ACL):
- Qian Chen, Dongyang Li, Xiaofeng He, Hongzhao Li, and Hongyu Yi. 2024. PE: A Poincare Explanation Method for Fast Text Hierarchy Generation. In Findings of the Association for Computational Linguistics: EMNLP 2024, pages 7876–7888, Miami, Florida, USA. Association for Computational Linguistics.
- Cite (Informal):
- PE: A Poincare Explanation Method for Fast Text Hierarchy Generation (Chen et al., Findings 2024)
- PDF:
- https://preview.aclanthology.org/dois-2013-emnlp/2024.findings-emnlp.462.pdf
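The pipeline the abstract outlines — project embeddings into a hyperbolic (Poincaré ball) space, then build the text hierarchy as a minimum spanning tree over hyperbolic distances — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the projection (simple norm clipping into the unit ball), the standard Poincaré geodesic distance, and the Prim-style tree construction are generic textbook components, and the paper's Shapley-score weighting is omitted.

```python
import numpy as np

def project_to_poincare_ball(x, eps=1e-5):
    """Clip a Euclidean vector into the open unit (Poincare) ball."""
    norm = np.linalg.norm(x)
    max_norm = 1.0 - eps
    if norm >= max_norm:
        x = x * (max_norm / norm)
    return x

def poincare_distance(u, v):
    """Geodesic distance between two points of the Poincare ball:
    d(u, v) = arccosh(1 + 2||u-v||^2 / ((1-||u||^2)(1-||v||^2)))."""
    sq = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return np.arccosh(1.0 + 2.0 * sq / denom)

def minimum_spanning_tree(points):
    """Prim's algorithm over pairwise Poincare distances.

    Returns a list of (i, j) edges connecting all points; for token
    embeddings, these edges induce the hierarchy over the text.
    """
    n = len(points)
    in_tree = [0]
    edges = []
    remaining = set(range(1, n))
    while remaining:
        # Greedily attach the closest outside point to the current tree.
        i, j, _ = min(
            ((i, j, poincare_distance(points[i], points[j]))
             for i in in_tree for j in remaining),
            key=lambda t: t[2],
        )
        edges.append((i, j))
        in_tree.append(j)
        remaining.remove(j)
    return edges

# Usage: project a few (hypothetical) token embeddings and span a tree.
rng = np.random.default_rng(0)
embeddings = [project_to_poincare_ball(v) for v in rng.normal(size=(5, 3))]
tree_edges = minimum_spanning_tree(embeddings)
```

A spanning tree over n points always has n - 1 edges, so the construction above runs in a single pass over the point set, which is the kind of time efficiency the abstract contrasts with greedy combinatorial search.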