Sparse Neurons Carry Strong Signals of Question Ambiguity in LLMs

Zhuoxuan Zhang, Jinhao Duan, Edward Kim, Kaidi Xu


Abstract
Ambiguity is pervasive in real-world questions, yet large language models (LLMs) often respond with confident answers rather than seeking clarification. In this work, we show that question ambiguity is linearly encoded in the internal representations of LLMs and can be both detected and controlled at the neuron level. During the model’s pre-filling stage, we identify a small number of neurons, as few as one, that encode question ambiguity information. Probes trained on these Ambiguity-Encoding Neurons (AENs) achieve strong performance on ambiguity detection and generalize across datasets, outperforming prompting-based and representation-based baselines. Layerwise analysis reveals that AENs emerge in shallow layers, suggesting that ambiguity signals are encoded early in the model’s processing pipeline. Finally, we show that by manipulating AENs we can shift the LLM’s behavior from direct answering to abstention. Our findings reveal that LLMs form compact internal representations of question ambiguity, enabling interpretable and controllable behavior.
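The probing setup the abstract describes can be illustrated with a toy sketch: a linear probe fit on the activation of a single neuron, with the two classes being ambiguous vs. unambiguous questions. Everything here is an assumption for illustration, not the paper's implementation: the activations are synthetic Gaussian draws standing in for a real AEN's values, and the probe is a one-feature logistic regression trained from scratch.

```python
import math
import random

random.seed(0)

# Synthetic stand-in for one Ambiguity-Encoding Neuron's activation:
# we ASSUME ambiguous questions tend to produce higher activations
# (illustrative only; real activations would come from the model's
# pre-filling pass).
def sample_activation(ambiguous: bool) -> float:
    mu = 1.5 if ambiguous else -1.5
    return random.gauss(mu, 1.0)

# 200 examples per class, label 1 = ambiguous.
data = [(sample_activation(y == 1), y) for y in [0, 1] * 200]

# Fit a one-feature logistic-regression probe with plain gradient descent.
w, b, lr = 0.0, 0.0, 0.1
for _ in range(200):
    gw = gb = 0.0
    for x, y in data:
        p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # predicted P(ambiguous)
        gw += (p - y) * x
        gb += (p - y)
    w -= lr * gw / len(data)
    b -= lr * gb / len(data)

# Probe accuracy on the training set (a held-out split and real
# activations would be used in practice).
acc = sum((1.0 / (1.0 + math.exp(-(w * x + b))) > 0.5) == (y == 1)
          for x, y in data) / len(data)
print(f"probe accuracy: {acc:.2f}")
```

Because the probe has a single weight, detection reduces to a learned threshold on one neuron's activation, which is what makes the "as few as one neuron" finding directly interpretable.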
Anthology ID:
2025.emnlp-main.813
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
16092–16110
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.813/
Cite (ACL):
Zhuoxuan Zhang, Jinhao Duan, Edward Kim, and Kaidi Xu. 2025. Sparse Neurons Carry Strong Signals of Question Ambiguity in LLMs. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 16092–16110, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Sparse Neurons Carry Strong Signals of Question Ambiguity in LLMs (Zhang et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.813.pdf
Checklist:
2025.emnlp-main.813.checklist.pdf