Learning to Plan by Updating Natural Language
Yiduo Guo, Yaobo Liang, Chenfei Wu, Wenshan Wu, Dongyan Zhao, Nan Duan
Abstract
Large Language Models (LLMs) have shown remarkable performance on various basic natural language tasks. To complete a complex task, however, we still need a task plan to guide the LLM in generating specific solutions step by step. LLMs can directly generate task plans, but these plans may contain factual errors or be incomplete. A high-quality task plan contains correct step-by-step solutions covering all situations and behavioral instructions for avoiding mistakes. To obtain one, we propose the Learning to Plan method, which involves two phases: (1) In the learning phase, the method iteratively updates the task plan with new step-by-step solutions and behavioral instructions, which are derived by prompting the LLM on training error feedback. (2) In the subsequent test phase, the LLM uses the learned task plan to guide its inference on the test set. We demonstrate the effectiveness of our method on five tasks spanning different reasoning types (8 datasets). Further, our analysis experiments show that a task plan learned by one LLM can directly guide another LLM to improve its performance, revealing a new transfer learning paradigm.
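To make the two-phase procedure concrete, the sketch below gives one possible reading of the abstract in Python. It is a minimal illustration, not the authors' implementation: the `call_llm` interface, the prompt wording, the exact-match error check, and the function names `solve`, `learn_task_plan`, and `evaluate` are all assumptions.

```python
# Minimal sketch of the two-phase Learning to Plan loop described in the abstract.
# All names, prompts, and the LLM interface are hypothetical placeholders.

def call_llm(prompt: str) -> str:
    """Placeholder for an LLM call (e.g., a chat-completion API)."""
    raise NotImplementedError

def solve(task_plan: str, question: str) -> str:
    """Answer a question while following the current natural-language task plan."""
    return call_llm(
        f"Task plan:\n{task_plan}\n\nQuestion: {question}\nAnswer step by step."
    )

def learn_task_plan(train_set, task_plan: str = "", epochs: int = 3) -> str:
    """Phase 1: iteratively revise the task plan from training error feedback."""
    for _ in range(epochs):
        for question, gold in train_set:
            prediction = solve(task_plan, question)
            if prediction.strip() != gold.strip():  # an error triggers a plan update
                task_plan = call_llm(
                    "The current task plan led to a wrong answer.\n"
                    f"Task plan:\n{task_plan}\n"
                    f"Question: {question}\nWrong answer: {prediction}\n"
                    f"Correct answer: {gold}\n"
                    "Rewrite the task plan, adding step-by-step solutions and "
                    "behavioral instructions that would avoid this mistake."
                )
    return task_plan

def evaluate(task_plan: str, test_set) -> float:
    """Phase 2: use the frozen learned plan to guide inference on the test set."""
    correct = sum(solve(task_plan, q).strip() == a.strip() for q, a in test_set)
    return correct / len(test_set)
```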
- Anthology ID: 2024.findings-emnlp.589
- Volume: Findings of the Association for Computational Linguistics: EMNLP 2024
- Month: November
- Year: 2024
- Address: Miami, Florida, USA
- Editors: Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 10062–10098
- URL: https://aclanthology.org/2024.findings-emnlp.589
- DOI: 10.18653/v1/2024.findings-emnlp.589
- Cite (ACL): Yiduo Guo, Yaobo Liang, Chenfei Wu, Wenshan Wu, Dongyan Zhao, and Nan Duan. 2024. Learning to Plan by Updating Natural Language. In Findings of the Association for Computational Linguistics: EMNLP 2024, pages 10062–10098, Miami, Florida, USA. Association for Computational Linguistics.
- Cite (Informal): Learning to Plan by Updating Natural Language (Guo et al., Findings 2024)
- PDF: https://preview.aclanthology.org/landing_page/2024.findings-emnlp.589.pdf