Efficient Many-Shot In-Context Learning with Dynamic Block-Sparse Attention
Emily Xiao, Chin-Jou Li, Yilin Zhang, Graham Neubig, Amanda Bertsch
Abstract
Many-shot in-context learning has recently shown promise as an alternative to finetuning, with the major advantage that the same model can be served for multiple tasks. However, this shifts the computational burden from training time to inference time, making it challenging to justify deploying many-shot ICL in practice. This cost is further increased if a custom demonstration set is retrieved for each inference example. We present Dynamic Block-Sparse Attention, an optimized method for retrieval-based many-shot in-context learning. By combining carefully designed block-sparse attention and retrieval of cached groups of demonstrations, we achieve per-example latency comparable to finetuning while maintaining, on average, >95% of the best method's accuracy across strong ICL and finetuning baselines. We hope that this will further enable the deployment of many-shot ICL at scale.
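As an illustrative aid only, the sketch below (not taken from the paper) shows one way a block-sparse attention mask over retrieved, cached demonstration groups could be laid out: each group attends causally within itself, while the test example attends to all retrieved groups. The group sizes, query length, and the helper name block_sparse_mask are assumptions for illustration.

```python
# Minimal sketch (assumed design, not the authors' implementation) of a
# block-sparse attention mask for retrieval-based many-shot ICL.
import numpy as np

def block_sparse_mask(group_lens, query_len):
    """Boolean mask (True = may attend): demonstration groups are encoded
    independently (block-diagonal, causal); query tokens attend to all
    groups and causally to themselves."""
    total = sum(group_lens) + query_len
    mask = np.zeros((total, total), dtype=bool)

    # Causal attention confined to each cached demonstration group.
    start = 0
    for g in group_lens:
        mask[start:start + g, start:start + g] = np.tril(np.ones((g, g), dtype=bool))
        start += g

    # Query tokens see every retrieved group plus their own causal prefix.
    q_start = start
    mask[q_start:, :q_start] = True
    mask[q_start:, q_start:] = np.tril(np.ones((query_len, query_len), dtype=bool))
    return mask

# Example: three retrieved groups of cached demonstrations and a 4-token query.
print(block_sparse_mask([3, 2, 3], 4).astype(int))
```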
- Anthology ID: 2025.acl-long.1542
- Volume: Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
- Month: July
- Year: 2025
- Address: Vienna, Austria
- Editors: Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
- Venue: ACL
- Publisher: Association for Computational Linguistics
- Pages: 31946–31958
- URL: https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.1542/
- Cite (ACL): Emily Xiao, Chin-Jou Li, Yilin Zhang, Graham Neubig, and Amanda Bertsch. 2025. Efficient Many-Shot In-Context Learning with Dynamic Block-Sparse Attention. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 31946–31958, Vienna, Austria. Association for Computational Linguistics.
- Cite (Informal): Efficient Many-Shot In-Context Learning with Dynamic Block-Sparse Attention (Xiao et al., ACL 2025)
- PDF: https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.1542.pdf