NormXLogit: The Head-on-Top Never Lies

Sina Abbasi, Mohammad Reza Modarres, Mohammad Taher Pilehvar


Abstract
With new large language models (LLMs) emerging frequently, it is important to consider the potential value of model-agnostic approaches that can provide interpretability across a variety of architectures. While recent advances in LLM interpretability show promise, many rely on complex, model-specific methods with high computational costs. To address these limitations, we propose NormXLogit, a novel technique for assessing the significance of individual input tokens. This method operates based on the input and output representations associated with each token. First, we demonstrate that the norm of word embeddings can be utilized as a measure of token importance. Second, we reveal a significant relationship between a token’s importance and how predictive its representation is of the model’s final output. Extensive analyses indicate that our approach outperforms existing gradient-based methods in terms of faithfulness and offers competitive performance compared to leading architecture-specific techniques.
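The abstract suggests combining two signals per token: the norm of its input embedding and how predictive its output representation is of the model's final prediction via the unembedding head. A minimal toy sketch of such a score is below; the variable names, the multiplicative combination, and the exact formula are assumptions for illustration, not the paper's definition.

```python
import numpy as np

# Hedged sketch of a NormXLogit-style token-importance score.
# Assumed form (illustrative only):
#   importance_i = ||x_i|| * logit_y(h_i @ W_U)
# where x_i is token i's input embedding, h_i its final-layer
# representation, W_U the unembedding ("head-on-top") matrix,
# and y the model's predicted class/token.

rng = np.random.default_rng(0)
seq_len, d_model, vocab = 4, 8, 10

X = rng.normal(size=(seq_len, d_model))    # input embeddings (toy values)
H = rng.normal(size=(seq_len, d_model))    # final-layer representations
W_U = rng.normal(size=(d_model, vocab))    # unembedding matrix

logits = H @ W_U                           # per-token logits via the LM head
y = int(np.argmax(logits[-1]))             # prediction from the last position

embedding_norms = np.linalg.norm(X, axis=1)   # ||x_i||: first signal
target_logits = logits[:, y]                  # second signal: support for y
importance = embedding_norms * target_logits  # one score per input token

print(importance)
```

In this toy setup the score is cheap to compute (no gradients, no attention internals), which matches the abstract's claim of low computational cost relative to gradient-based attribution.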
Anthology ID:
2025.emnlp-main.1769
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
34914–34935
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1769/
Cite (ACL):
Sina Abbasi, Mohammad Reza Modarres, and Mohammad Taher Pilehvar. 2025. NormXLogit: The Head-on-Top Never Lies. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 34914–34935, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
NormXLogit: The Head-on-Top Never Lies (Abbasi et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1769.pdf
Checklist:
2025.emnlp-main.1769.checklist.pdf