Mitigating Copy Bias in In-Context Learning through Neuron Pruning

Ameen Ali Ali, Lior Wolf, Ivan Titov


Abstract
Large language models (LLMs) have demonstrated impressive few-shot in-context learning (ICL) abilities. Still, we show that they are sometimes prone to a ‘copying bias’, where they copy answers from provided examples instead of learning the underlying patterns. In this work, we propose a novel and simple method to mitigate such copying bias. First, we create a synthetic task and use the Integrated Gradients method to identify neurons that prioritize copying over generalization. We demonstrate that pruning these neurons consistently improves performance across a diverse set of ICL tasks, including both single-token and multi-token scenarios, while maintaining or even improving the model’s general capabilities. We also show that our method is applicable across various LLM architectures, including Transformers and State-Space Models, without requiring modifications. In our analysis, we adopt a task-recognition perspective on ICL and examine task vectors (Hendel et al., 2023) induced by the model. We find that pruning enhances the quality of these vectors, suggesting that the pruned neurons previously hindered effective task recognition.
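The attribute-then-prune recipe the abstract describes can be sketched in miniature. The toy layer, neuron count, and pruning threshold below are illustrative assumptions, not the paper's implementation; for a linear readout the Integrated Gradients reduce to a closed form, but the generic path-integral sum is shown to mirror how IG is computed for real networks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "layer": hidden neurons h = relu(A @ x), scalar readout y = w @ h.
A = rng.normal(size=(8, 4))
w = rng.normal(size=8)

def hidden(x):
    return np.maximum(A @ x, 0.0)

def output(h):
    return float(w @ h)

def integrated_gradients(h, baseline=None, steps=64):
    """Riemann-sum Integrated Gradients of the output w.r.t. hidden neurons.

    With a linear readout the gradient d(output)/dh is constant (= w), so
    IG collapses to w * (h - baseline); the averaging loop is kept to show
    the generic computation used for nonlinear models."""
    if baseline is None:
        baseline = np.zeros_like(h)
    alphas = (np.arange(steps) + 0.5) / steps          # midpoint rule over the path
    grads = np.stack([w for _ in alphas])              # gradient at each interpolation point
    avg_grad = grads.mean(axis=0)
    return (h - baseline) * avg_grad

x = rng.normal(size=4)
h = hidden(x)
attr = integrated_gradients(h)

# IG completeness axiom: attributions sum to output(h) - output(baseline).
assert np.isclose(attr.sum(), output(h) - output(np.zeros_like(h)))

# "Prune" the k neurons with the largest absolute attribution by zeroing
# their readout weights (standing in for ablating copy-biased neurons).
k = 2
to_prune = np.argsort(-np.abs(attr))[:k]
w_pruned = w.copy()
w_pruned[to_prune] = 0.0
print("pruned neurons:", sorted(to_prune.tolist()))
```

In the paper's setting the attribution target would instead contrast copying against generalization on a synthetic task; the pruning step (zeroing the identified neurons) is the same in spirit.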
Anthology ID:
2026.findings-eacl.13
Volume:
Findings of the Association for Computational Linguistics: EACL 2026
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Vera Demberg, Kentaro Inui, Lluís Màrquez
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
230–251
URL:
https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.13/
Cite (ACL):
Ameen Ali Ali, Lior Wolf, and Ivan Titov. 2026. Mitigating Copy Bias in In-Context Learning through Neuron Pruning. In Findings of the Association for Computational Linguistics: EACL 2026, pages 230–251, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
Mitigating Copy Bias in In-Context Learning through Neuron Pruning (Ali et al., Findings 2026)
PDF:
https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.13.pdf
Checklist:
2026.findings-eacl.13.checklist.pdf