Hopscotch: Discovering and Skipping Redundancies in Language Models

Mustafa Eyceoz, Nikhil Shivakumar Nayak, Hao Wang, Ligong Han, Akash Srivastava


Abstract
Modern causal language models stack many attention blocks to improve performance, but not all blocks are necessary for every task. We propose Hopscotch, a simple yet effective method that identifies and skips the attention blocks that contribute least to a task, then adapts the model to preserve output quality. Hopscotch jointly optimizes which blocks to skip and how to scale the outputs of the remaining layers. By introducing lightweight, trainable scaling parameters to attention and MLP blocks, it mitigates the distribution shifts in hidden states caused by removing attention blocks. Hopscotch does not modify model weights or require access to pretraining or instruction-tuning data, and it is compatible with existing model compression techniques. When applied to Llama-3.1-8B and Qwen-2.5-7B, Hopscotch achieves less than a 2% drop in performance even after skipping four attention blocks.
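The core mechanism the abstract describes can be sketched in a few lines: a transformer block whose attention sub-block can be skipped, with lightweight trainable scalars rescaling the residual contributions of the remaining sub-blocks. This is a minimal illustration under assumed details (the class name `ScaledBlock` and the parameters `attn_scale` and `mlp_scale` are hypothetical, not the authors' implementation, which is not given on this page):

```python
import torch
import torch.nn as nn

class ScaledBlock(nn.Module):
    """Toy pre-norm transformer block, sketching the Hopscotch idea:
    the attention sub-block can be skipped entirely, and trainable
    scalar scales on the residual branches compensate for the
    resulting shift in hidden-state distribution."""

    def __init__(self, dim: int, skip_attn: bool = False):
        super().__init__()
        self.skip_attn = skip_attn
        self.ln1 = nn.LayerNorm(dim)
        self.ln2 = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
        self.mlp = nn.Sequential(
            nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim)
        )
        # Lightweight trainable scales; in the Hopscotch setting only
        # these would be tuned, while the original weights stay frozen.
        self.attn_scale = nn.Parameter(torch.ones(1))
        self.mlp_scale = nn.Parameter(torch.ones(1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if not self.skip_attn:
            h = self.ln1(x)
            attn_out, _ = self.attn(h, h, h, need_weights=False)
            x = x + self.attn_scale * attn_out
        return x + self.mlp_scale * self.mlp(self.ln2(x))

# A skipped block still maps hidden states to hidden states of the
# same shape, so it drops into an existing stack of layers.
block = ScaledBlock(dim=32, skip_attn=True)
out = block(torch.randn(2, 5, 32))
```

Because only the scalar scales are trainable, adapting a model after skipping blocks adds just two parameters per block, which is what makes the method compatible with frozen, compressed, or quantized weights.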
Anthology ID:
2025.findings-emnlp.861
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
15904–15913
URL:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.861/
DOI:
10.18653/v1/2025.findings-emnlp.861
Cite (ACL):
Mustafa Eyceoz, Nikhil Shivakumar Nayak, Hao Wang, Ligong Han, and Akash Srivastava. 2025. Hopscotch: Discovering and Skipping Redundancies in Language Models. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 15904–15913, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Hopscotch: Discovering and Skipping Redundancies in Language Models (Eyceoz et al., Findings 2025)
PDF:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.861.pdf
Checklist:
 2025.findings-emnlp.861.checklist.pdf