Enhancing Chain-of-Thought Reasoning via Neuron Activation Differential Analysis
Yiru Tang, Kun Zhou, Yingqian Min, Xin Zhao, Jing Sha, Zhichao Sheng, Shijin Wang
Abstract
Despite the impressive chain-of-thought (CoT) reasoning ability of large language models (LLMs), its underlying mechanisms remain unclear. In this paper, we explore the inner workings of LLMs' CoT ability through the lens of neurons in the feed-forward layers. We propose an efficient method to identify reasoning-critical neurons by analyzing their activation patterns under reasoning chains of varying quality. Building on this, we devise a simple intervention method that directly stimulates these reasoning-critical neurons to guide the generation of high-quality reasoning chains. Extensive experiments validate the effectiveness of our method and demonstrate the critical role the identified neurons play in CoT reasoning.
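To make the abstract's two steps concrete, below is a minimal PyTorch sketch of the general idea, not the paper's implementation: `gpt2` stands in for the LLM, the `block.mlp.act` module path is GPT-2-specific, and the toy reasoning chains, the top-100 cutoff, and the additive boost strength `alpha` are all illustrative assumptions.

```python
import torch
from collections import defaultdict
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical setup: GPT-2 stands in for the LLM; the module path to the
# FFN nonlinearity (block.mlp.act) is GPT-2-specific and assumed here.
tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2").eval()

# Placeholder reasoning chains; in practice these would be model-generated
# CoT traces of varying quality (e.g., split by answer correctness).
good_chains = ["Q: 17 + 25? Step 1: 17 + 25 = 42. Answer: 42."]
bad_chains = ["Q: 17 + 25? Step 1: 17 + 25 = 32. Answer: 32."]

def mean_ffn_activations(texts):
    """Per-layer FFN activations averaged over every token in `texts`."""
    sums, n_tokens, hooks = defaultdict(lambda: 0.0), 0, []

    def make_collector(layer):
        def hook(_module, _inputs, output):
            # Accumulate post-GELU FFN activations, summed over batch and seq.
            sums[layer] = sums[layer] + output.detach().float().sum(dim=(0, 1))
        return hook

    for i, block in enumerate(model.transformer.h):
        hooks.append(block.mlp.act.register_forward_hook(make_collector(i)))
    for text in texts:
        batch = tok(text, return_tensors="pt")
        with torch.no_grad():
            model(**batch)
        n_tokens += batch["input_ids"].numel()
    for h in hooks:
        h.remove()
    return {i: s / n_tokens for i, s in sums.items()}

# Step 1: score each neuron by its activation differential between
# high- and low-quality chains, and keep the top-k as reasoning-critical.
good, bad = mean_ffn_activations(good_chains), mean_ffn_activations(bad_chains)
diff = {(i, j): (good[i][j] - bad[i][j]).item()
        for i in good for j in range(good[i].shape[0])}
critical = sorted(diff, key=diff.get, reverse=True)[:100]

# Step 2: intervention sketch -- additively stimulate the selected neurons
# at inference time. `alpha` is a hypothetical boost strength.
alpha = 2.0
per_layer = defaultdict(list)
for i, j in critical:
    per_layer[i].append(j)

def make_booster(neuron_idx):
    def hook(_module, _inputs, output):
        output[..., neuron_idx] += alpha  # nudge critical neurons upward
        return output
    return hook

for i, idx in per_layer.items():
    model.transformer.h[i].mlp.act.register_forward_hook(make_booster(idx))
```

Forward hooks let the sketch read and modify FFN activations without touching model code; the paper's actual neuron-scoring criterion and intervention rule may differ from this simple mean-difference and additive boost.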
- Anthology ID: 2025.emnlp-main.817
- Volume: Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
- Month: November
- Year: 2025
- Address: Suzhou, China
- Editors: Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
- Venue: EMNLP
- Publisher: Association for Computational Linguistics
- Pages: 16162–16170
- URL: https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.817/
- Cite (ACL): Yiru Tang, Kun Zhou, Yingqian Min, Xin Zhao, Jing Sha, Zhichao Sheng, and Shijin Wang. 2025. Enhancing Chain-of-Thought Reasoning via Neuron Activation Differential Analysis. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 16162–16170, Suzhou, China. Association for Computational Linguistics.
- Cite (Informal): Enhancing Chain-of-Thought Reasoning via Neuron Activation Differential Analysis (Tang et al., EMNLP 2025)
- PDF: https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.817.pdf