IG-Pruning: Input-Guided Block Pruning for Large Language Models

Kangyu Qiao, Shaolei Zhang, Yang Feng


Abstract
With the growing computational demands of large language models (LLMs), efficient inference has become increasingly critical for practical deployment. Depth pruning has emerged as a promising approach for reducing the computational cost of LLMs by removing transformer layers. However, existing methods typically rely on fixed block masks, which can lead to suboptimal performance across different tasks and inputs. In this paper, we propose IG-Pruning, a novel input-aware block-wise pruning method that dynamically selects layer masks at inference time. Our approach consists of two stages: (1) discovering diverse mask candidates through semantic clustering and L0 optimization, and (2) implementing efficient dynamic pruning without the need for extensive training. Experimental results demonstrate that our method consistently outperforms state-of-the-art static depth pruning methods, making it particularly suitable for resource-constrained deployment scenarios.
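To make the two-stage idea in the abstract concrete, here is a minimal, hypothetical sketch of input-guided layer skipping. This is not the authors' implementation: it assumes a per-input embedding, a set of cluster centroids discovered offline (standing in for the paper's semantic clustering), and one binary layer mask per cluster (standing in for the L0-optimized mask candidates). At inference time, the input is routed to its nearest centroid and only the layers kept by that cluster's mask are executed.

```python
# Hypothetical sketch of input-guided block (layer) pruning.
# All names and structures here are illustrative assumptions, not the
# paper's actual code: `centroids` would come from semantic clustering,
# and `masks` from L0 optimization over layer masks.
import math

def nearest_centroid(embedding, centroids):
    """Index of the centroid closest (Euclidean) to the input embedding."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(range(len(centroids)), key=lambda i: dist(embedding, centroids[i]))

def forward_with_mask(hidden, layers, mask):
    """Run only layers whose mask bit is 1; pruned layers act as identity."""
    for layer, keep in zip(layers, mask):
        if keep:
            hidden = layer(hidden)
    return hidden

# Toy stand-in: 4 "transformer layers" (each just adds a constant), 2 clusters.
layers = [lambda h, c=c: h + c for c in (1, 2, 3, 4)]
centroids = [(0.0, 0.0), (1.0, 1.0)]          # discovered offline
masks = {0: [1, 1, 0, 0], 1: [1, 0, 1, 1]}    # one mask candidate per cluster

embedding = (0.9, 1.1)                        # per-input representation
cluster = nearest_centroid(embedding, centroids)
out = forward_with_mask(0, layers, masks[cluster])
print(cluster, out)  # cluster 1 -> layers 1, 3, 4 applied: 0 + 1 + 3 + 4 = 8
```

The key property this sketch illustrates is that the mask is chosen per input at inference time rather than fixed once, so different inputs can skip different layers at no training cost beyond the offline mask-discovery stage.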
Anthology ID:
2025.emnlp-main.537
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
10629–10640
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.537/
Cite (ACL):
Kangyu Qiao, Shaolei Zhang, and Yang Feng. 2025. IG-Pruning: Input-Guided Block Pruning for Large Language Models. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 10629–10640, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
IG-Pruning: Input-Guided Block Pruning for Large Language Models (Qiao et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.537.pdf
Checklist:
2025.emnlp-main.537.checklist.pdf