Tree-of-Prompts: Abstracting Control-Flow for Prompt Optimization

Jihyuk Kim, Shubham Garg, Lahari Poddar, Seung-won Hwang, Chris Hench


Abstract
Prompt optimization (PO) generates prompts to guide Large Language Models (LLMs) in performing tasks. Existing methods, such as PromptAgent, rely on a single static prompt, which struggles with disjoint cases in complex tasks. Although MoP uses multiple prompts, it fails to account for variations in task complexity. Inspired by programmatic control flow, we introduce a nested if-else structure to address both varying similarities and complexities across diverse cases. We propose Tree-of-Prompts (ToP), which implements this structure by recursively expanding child prompts from a parent prompt. Sibling prompts tackle disjoint cases while inheriting shared similarities from their parent, and handle cases more complex than those the parent covers. Evaluated on Gorilla (understanding), MATH (reasoning), and a subset of BBH benchmarks, ToP outperforms PromptAgent and MoP, with improvements of 1.4% and 4.6% over PromptAgent and 3.2% and 4.5% over MoP, when tested with GPT-4o-mini and Llama 3.2-3B, respectively.
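The nested if-else structure described above can be pictured as routing each input down a tree of increasingly specialized prompts. A minimal sketch, assuming a hypothetical `PromptNode` type with an illustrative router predicate (this is not the paper's implementation, only an analogy for its control-flow idea):

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class PromptNode:
    """One node in a prompt tree: a prompt plus children that
    specialize it for disjoint or more complex sub-cases."""
    prompt: str
    matches: Callable[[str], bool] = lambda case: True  # illustrative router
    children: List["PromptNode"] = field(default_factory=list)

def select_prompt(node: PromptNode, case: str) -> str:
    """Walk the tree like nested if-else: the first matching child
    refines the parent's prompt; otherwise the parent handles the case."""
    for child in node.children:
        if child.matches(case):
            return select_prompt(child, case)
    return node.prompt

# Toy tree: a generic parent prompt with sibling children for
# disjoint sub-cases (keyword routing here is purely for illustration).
root = PromptNode("Solve the problem step by step.")
root.children = [
    PromptNode("Solve the algebra problem; isolate the variable.",
               matches=lambda c: "x" in c),
    PromptNode("Solve the geometry problem; sketch the figure first.",
               matches=lambda c: "triangle" in c),
]

print(select_prompt(root, "Find x if 2x + 3 = 7"))
```

Inputs matching no child fall back to the parent's prompt, mirroring the final `else` branch of a nested conditional.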
Anthology ID:
2025.findings-acl.995
Volume:
Findings of the Association for Computational Linguistics: ACL 2025
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
19436–19459
URL:
https://preview.aclanthology.org/landing_page/2025.findings-acl.995/
Cite (ACL):
Jihyuk Kim, Shubham Garg, Lahari Poddar, Seung-won Hwang, and Chris Hench. 2025. Tree-of-Prompts: Abstracting Control-Flow for Prompt Optimization. In Findings of the Association for Computational Linguistics: ACL 2025, pages 19436–19459, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Tree-of-Prompts: Abstracting Control-Flow for Prompt Optimization (Kim et al., Findings 2025)
PDF:
https://preview.aclanthology.org/landing_page/2025.findings-acl.995.pdf