Efficient Beam Search for Large Language Models Using Trie-Based Decoding

Brian J Chan, Mao-xun Huang, Jui-Hung Cheng, Chao-Ting Chen, Hen-Hsen Huang


Abstract
This work presents a novel trie (prefix-tree)-based parallel decoding method that addresses the memory inefficiency of batch-based beam search. By sharing a single KV cache across beams with common prefixes, our approach dramatically reduces memory usage and enables efficient decoding. We evaluated our method across three attention architectures: Multi-Head Attention (Phi-3.5-mini-instruct), Grouped Query Attention (Llama-3.1-8B-Instruct), and Sliding Window Attention (Mistral-Small-24B-Instruct-2501), using CNN/DailyMail for abstractive summarization and HumanEval for code generation. Our experiments demonstrate substantial memory savings (4–8×) and up to 2.4× faster decoding, without compromising generation quality. These results highlight our method’s suitability for memory-constrained environments and large-scale deployments.
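To make the core idea concrete, below is a minimal, hypothetical Python sketch of beam search over a trie with a shared KV cache. It is not the authors' implementation: TrieNode, toy_logits, and trie_beam_search are invented names, a deterministic random scorer stands in for the language model, and a placeholder object stands in for the per-layer key/value tensors a real decoder would store at each node. What it illustrates is the memory accounting: each decoded position is cached once in the trie and shared by every beam passing through that prefix, whereas batch-based beam search duplicates the full prefix cache for every beam.

# Hypothetical sketch (not the paper's code): beam search over a trie where
# each node owns one KV-cache slot shared by every beam through that prefix.
import math
import random

class TrieNode:
    """One decoded token. Its KV entry is stored once and shared by all
    beams whose hypotheses pass through this node."""
    def __init__(self, token, parent=None):
        self.token = token
        self.parent = parent
        self.children = {}
        self.kv_slot = object()  # stand-in for this position's per-layer K/V tensors

    def path_tokens(self):
        node, tokens = self, []
        while node is not None:
            tokens.append(node.token)
            node = node.parent
        return tokens[::-1]

def toy_logits(prefix, vocab_size):
    # Deterministic stand-in for a language-model forward pass: returns
    # log-probabilities over the vocabulary given the prefix.
    rng = random.Random(hash(tuple(prefix)) & 0xFFFFFFFF)
    scores = [rng.random() for _ in range(vocab_size)]
    log_z = math.log(sum(math.exp(s) for s in scores))
    return [s - log_z for s in scores]

def trie_beam_search(bos=0, beam_width=4, steps=8, vocab_size=50):
    root = TrieNode(bos)
    beams = [(0.0, root)]  # (cumulative log-prob, leaf node in the trie)
    for _ in range(steps):
        candidates = []
        for logp, leaf in beams:
            logprobs = toy_logits(leaf.path_tokens(), vocab_size)
            top = sorted(range(vocab_size), key=lambda t: -logprobs[t])[:beam_width]
            candidates += [(logp + logprobs[t], leaf, t) for t in top]
        candidates.sort(key=lambda c: -c[0])
        beams = []
        for logp, leaf, t in candidates[:beam_width]:
            child = leaf.children.get(t)
            if child is None:  # extend the trie; the shared prefix is reused as-is
                child = TrieNode(t, leaf)
                leaf.children[t] = child
            beams.append((logp, child))
    return beams, root

def count_nodes(node):
    return 1 + sum(count_nodes(c) for c in node.children.values())

if __name__ == "__main__":
    beams, root = trie_beam_search()
    best_logp, best_leaf = max(beams, key=lambda b: b[0])
    trie_slots = count_nodes(root)  # KV entries actually stored
    batched_slots = sum(len(leaf.path_tokens()) for _, leaf in beams)  # per-beam duplication
    print("best hypothesis:", best_leaf.path_tokens(), f"(logp={best_logp:.3f})")
    print(f"KV slots: trie={trie_slots} vs. batched={batched_slots}")

Running the sketch prints the number of KV entries held by the trie versus the per-beam duplication a batched implementation would need; the gap grows with beam width and prefix overlap, mirroring in miniature the 4–8× savings reported above. A real implementation would additionally need trie-aware attention masking so each position attends only to its ancestors in the trie.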
Anthology ID:
2025.emnlp-main.748
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
14806–14818
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.748/
Cite (ACL):
Brian J Chan, Mao-xun Huang, Jui-Hung Cheng, Chao-Ting Chen, and Hen-Hsen Huang. 2025. Efficient Beam Search for Large Language Models Using Trie-Based Decoding. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 14806–14818, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Efficient Beam Search for Large Language Models Using Trie-Based Decoding (Chan et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.748.pdf
Checklist:
2025.emnlp-main.748.checklist.pdf