Best-k Search Algorithm for Neural Text Generation

Jiacheng Xu, Caiming Xiong, Silvio Savarese, Yingbo Zhou


Abstract
Modern natural language generation paradigms require a decoding strategy to obtain high-quality sequences from the model. Beam search yields high-quality but low-diversity outputs; stochastic approaches suffer from high variance and sometimes low quality. In this work, we propose a deterministic search algorithm that balances quality and diversity. We first investigate the vanilla best-first search (BFS) algorithm and then propose the best-k search algorithm. Inspired by BFS, we greedily expand the top k nodes, instead of only the first node, to boost efficiency and diversity. Upweighting recently discovered nodes, together with heap pruning, ensures the completeness of the search procedure. Experiments on four NLG tasks show that best-k search yields more diverse and natural outputs than strong baselines while maintaining high text quality. The proposed algorithm is parameter-free, lightweight, efficient, and easy to use.
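The abstract outlines the mechanics: a best-first-style frontier kept in a priority queue, k nodes expanded per step instead of one, a recency bonus for newly discovered nodes, and heap pruning to bound the frontier. The sketch below illustrates that shape in Python; it is not the authors' reference implementation, and the expand(prefix) model interface, the recency bonus form, and the pruning bound are all illustrative assumptions.

import heapq
from typing import Callable, List, Tuple

def best_k_search(
    expand: Callable[[Tuple[int, ...]], List[Tuple[int, float]]],
    eos_id: int,
    k: int = 5,
    max_steps: int = 50,
    max_heap_size: int = 1000,   # heap pruning bound (assumed value)
    recency_weight: float = 0.1, # weight of the recency bonus (assumed form)
) -> List[Tuple[float, Tuple[int, ...]]]:
    """Deterministic best-k search sketch.

    `expand(prefix)` is assumed to return (token_id, log_prob) pairs for
    the top continuations of `prefix` under the model.
    """
    # Heap entries: (priority, tie_breaker, cumulative log-prob, prefix).
    # heapq is a min-heap, so priorities are negated scores.
    heap = [(0.0, 0, 0.0, (0,))]  # assume token id 0 is BOS
    counter = 1
    completed = []
    for step in range(max_steps):
        if not heap:
            break
        # Pop the k best frontier nodes and expand them together,
        # rather than expanding a single node as vanilla BFS does.
        batch = [heapq.heappop(heap) for _ in range(min(k, len(heap)))]
        for _, _, logp, prefix in batch:
            for tok, tok_logp in expand(prefix):
                child = prefix + (tok,)
                child_logp = logp + tok_logp
                if tok == eos_id:
                    completed.append((child_logp, child))
                    continue
                # Upweight recently discovered nodes: children created at a
                # later step receive a bonus (an assumed form of the idea).
                priority = -(child_logp + recency_weight * step)
                heapq.heappush(heap, (priority, counter, child_logp, child))
                counter += 1
        # Heap pruning: keep only the best-scoring frontier nodes so the
        # queue stays bounded.
        if len(heap) > max_heap_size:
            heap = heapq.nsmallest(max_heap_size, heap)
            heapq.heapify(heap)
    return sorted(completed, reverse=True)

Popping k nodes per step is what enables the efficiency gain the abstract mentions: the k expansions can be batched through the model in one forward pass, unlike the strictly one-node-at-a-time expansion of vanilla best-first search.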
Anthology ID:
2023.acl-long.692
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
12385–12401
URL:
https://aclanthology.org/2023.acl-long.692
DOI:
10.18653/v1/2023.acl-long.692
Cite (ACL):
Jiacheng Xu, Caiming Xiong, Silvio Savarese, and Yingbo Zhou. 2023. Best-k Search Algorithm for Neural Text Generation. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 12385–12401, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Best-k Search Algorithm for Neural Text Generation (Xu et al., ACL 2023)
PDF:
https://preview.aclanthology.org/emnlp22-frontmatter/2023.acl-long.692.pdf
Video:
https://preview.aclanthology.org/emnlp22-frontmatter/2023.acl-long.692.mp4