COUNTDOWN: Contextually Sparse Activation Filtering Out Unnecessary Weights in Down Projection

Jaewon Cheon, Pilsung Kang


Abstract
The growing size of large language models has created significant computational inefficiencies. To address this challenge, sparse activation selectively deactivates non-essential parameters during inference, reducing the computational cost of FFNN layers. While existing methods focus on non-linear gating mechanisms, we hypothesize that the sparsity of the FFNN layer arises globally, in the form of a linear combination over its internal down projection matrix. Based on this insight, we propose two methods: M-COUNTDOWN, which leverages indirect coefficients, and D-COUNTDOWN, which utilizes the direct coefficients of the linear combination. Experimental results demonstrate that D-COUNTDOWN can ideally omit 90% of computations with performance loss as low as 5.5%, while M-COUNTDOWN provides a predictor-free solution with up to 29.4% better performance preservation than existing methods. Our specialized kernel implementations translate these theoretical gains into substantial real-world acceleration.
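
To make the linear-combination view concrete: in a gated FFNN the layer output is W_down h, where the coefficient vector h is produced by the gate and up projections, so skipping the columns of W_down paired with small-magnitude coefficients skips most of the down projection. Below is a minimal sketch assuming a SwiGLU-style FFNN with an oracle top-k selection; the function name sparse_down_projection and the keep_ratio parameter are illustrative, not the paper's API.

import torch
import torch.nn.functional as F

def sparse_down_projection(x, W_gate, W_up, W_down, keep_ratio=0.1):
    # Illustrative sketch, not the paper's implementation: treat the FFNN
    # output as a linear combination of W_down's columns and keep only the
    # columns whose coefficients have the largest magnitudes.
    h = F.silu(x @ W_gate.T) * (x @ W_up.T)   # direct coefficients (SwiGLU-style)
    k = max(1, int(keep_ratio * h.numel()))
    idx = h.abs().topk(k).indices             # oracle top-k selection
    return W_down[:, idx] @ h[idx]            # only k columns participate

# Tiny usage example with random weights (d_model=8, d_ff=32).
d, d_ff = 8, 32
x = torch.randn(d)
W_gate, W_up = torch.randn(d_ff, d), torch.randn(d_ff, d)
W_down = torch.randn(d, d_ff)
print(sparse_down_projection(x, W_gate, W_up, W_down))

Note that this oracle still computes every coefficient, so it yields no speedup by itself; the actual methods avoid computing h in full (per the abstract, M-COUNTDOWN is predictor-free, implying D-COUNTDOWN relies on a predictor) and pair the selection with specialized kernels to realize the gains.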
Anthology ID:
2025.emnlp-main.1442
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
28381–28397
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1442/
Cite (ACL):
Jaewon Cheon and Pilsung Kang. 2025. COUNTDOWN: Contextually Sparse Activation Filtering Out Unnecessary Weights in Down Projection. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 28381–28397, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
COUNTDOWN: Contextually Sparse Activation Filtering Out Unnecessary Weights in Down Projection (Cheon & Kang, EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1442.pdf
Checklist:
 2025.emnlp-main.1442.checklist.pdf