UniICL: An Efficient ICL Framework Unifying Compression, Selection, and Generation

Jun Gao, Qi Lv, Zili Wang, Tianxiang Wu, Ziqiang Cao, Wenjie Li


Abstract
In-context learning (ICL) enhances the reasoning abilities of Large Language Models (LLMs) by prepending a few demonstrations, which motivates researchers to introduce more examples to provide additional contextual information for generation. However, existing methods suffer from excessive growth in context length, which imposes a heavy hardware burden. Additionally, shallowly relevant examples selected by off-the-shelf retrieval tools hinder LLMs from capturing useful contextual information for generation. In this paper, to address these limitations, we propose UniICL, a novel Unified ICL framework that unifies demonstration compression, demonstration selection, and final response generation. Furthermore, to avoid repeatedly compressing the same demonstration and to boost inference efficiency, we design a tailored compression strategy that allows UniICL to cache compression results in a Demonstration Bank (DB). Extensive out-of-domain evaluations demonstrate the advantages of UniICL in both effectiveness and efficiency.
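To illustrate the Demonstration Bank (DB) caching idea mentioned in the abstract, the minimal sketch below shows how compressed demonstrations could be reused across queries instead of being recompressed each time. The class name `DemonstrationBank`, the `compress` callable, and the toy compressor are hypothetical stand-ins and not the authors' actual implementation or API.

```python
# Hypothetical sketch (not the paper's code): cache each demonstration's
# compressed form so repeated demonstrations are compressed only once.
from typing import Callable, Dict, List


class DemonstrationBank:
    """Cache mapping a raw demonstration string to its compressed form."""

    def __init__(self, compress: Callable[[str], List[float]]):
        self.compress = compress          # stand-in for a compression module
        self._cache: Dict[str, List[float]] = {}

    def get(self, demonstration: str) -> List[float]:
        # Compress only on a cache miss; later requests reuse the result.
        if demonstration not in self._cache:
            self._cache[demonstration] = self.compress(demonstration)
        return self._cache[demonstration]


if __name__ == "__main__":
    # Toy compressor: pretend compression yields a short fixed-size vector.
    toy_compress = lambda text: [float(len(text)), float(sum(map(ord, text)) % 97)]
    db = DemonstrationBank(toy_compress)
    demos = ["Q: 2+2? A: 4", "Q: capital of France? A: Paris", "Q: 2+2? A: 4"]
    compressed = [db.get(d) for d in demos]   # the repeated demo is compressed once
    print(compressed)
```

In this sketch the cache key is the raw demonstration text; in practice any stable identifier for a demonstration would serve the same purpose of avoiding repeated compression.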
Anthology ID:
2025.acl-long.24
Volume:
Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
500–510
URL:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.24/
Cite (ACL):
Jun Gao, Qi Lv, Zili Wang, Tianxiang Wu, Ziqiang Cao, and Wenjie Li. 2025. UniICL: An Efficient ICL Framework Unifying Compression, Selection, and Generation. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 500–510, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
UniICL: An Efficient ICL Framework Unifying Compression, Selection, and Generation (Gao et al., ACL 2025)
PDF:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.24.pdf