Optimizing Instruction Synthesis: Effective Exploration of Evolutionary Space with Tree Search

Li Chenglin, Qianglong Chen, Zhi Li, FengTao FengTao, Yicheng Li, Hao Chen, Fei Yu, Yin Zhang


Abstract
Instruction tuning is a crucial technique for aligning language models with real-world human goals. Extensive research has highlighted that the quality of instruction data is essential for the success of this alignment. However, creating high-quality data manually is labor-intensive and time-consuming, leading researchers to explore synthesizing data with LLMs. Recent studies have focused on using a stronger LLM to iteratively enhance existing instruction data, with promising results. Nevertheless, previous work often lacks control over the direction of evolution, resulting in high uncertainty in the data synthesis process and low-quality instructions. In this paper, we introduce IDEA-MCTS (Instruction Data Enhancement using Monte Carlo Tree Search), a general and scalable framework for efficiently synthesizing instructions. With tree search and evaluation models, it efficiently guides each instruction to evolve into a high-quality form, aiding instruction fine-tuning. Experimental results show that IDEA-MCTS significantly enhances the seed instruction data, raising the average evaluation scores of quality, diversity, and complexity from 2.19 to 3.81. Furthermore, on open-domain benchmarks, IDEA-MCTS improves the accuracy of real-world instruction-following skills of LLMs by an average of 5% in low-resource settings.
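
The paper's own implementation is not reproduced on this page; the following minimal Python sketch is only a rough illustration of the MCTS-guided evolution loop the abstract describes. The evolution operations, the evolve_instruction rewriter, and the score_instruction evaluator are hypothetical placeholders standing in for the stronger-LLM rewrites and the evaluation models used in the paper.

    import math
    import random

    # Hypothetical stand-ins for the LLM-based components described in the abstract:
    # an "evolver" that rewrites an instruction (e.g. adds constraints or deepens
    # reasoning) and an "evaluator" that scores quality/diversity/complexity.
    EVOLUTION_OPS = ["add_constraints", "deepen", "concretize", "broaden"]

    def evolve_instruction(instruction: str, op: str) -> str:
        """Placeholder for a stronger-LLM rewrite; here it just tags the text."""
        return f"{instruction} [{op}]"

    def score_instruction(instruction: str) -> float:
        """Placeholder for evaluation models; returns a pseudo-score in [0, 5]."""
        return min(5.0, 1.0 + 0.5 * instruction.count("["))

    class Node:
        def __init__(self, instruction, parent=None):
            self.instruction = instruction
            self.parent = parent
            self.children = []
            self.visits = 0
            self.value = 0.0

        def uct(self, c=1.4):
            # Standard UCT score balancing exploitation and exploration.
            if self.visits == 0:
                return float("inf")
            return self.value / self.visits + c * math.sqrt(
                math.log(self.parent.visits) / self.visits
            )

    def mcts_evolve(seed_instruction: str, iterations: int = 50) -> str:
        root = Node(seed_instruction)
        for _ in range(iterations):
            # Selection: descend by UCT while nodes are fully expanded.
            node = root
            while node.children and len(node.children) == len(EVOLUTION_OPS):
                node = max(node.children, key=Node.uct)
            # Expansion: apply one untried evolution operation.
            tried = {c.instruction for c in node.children}
            ops = [op for op in EVOLUTION_OPS
                   if evolve_instruction(node.instruction, op) not in tried]
            if ops:
                op = random.choice(ops)
                child = Node(evolve_instruction(node.instruction, op), parent=node)
                node.children.append(child)
                node = child
            # Evaluation: score the evolved instruction with the evaluator.
            reward = score_instruction(node.instruction)
            # Backpropagation: update statistics along the path to the root.
            while node is not None:
                node.visits += 1
                node.value += reward
                node = node.parent
        # Return the highest-scoring instruction found anywhere in the tree.
        best, _ = max(_all_nodes(root), key=lambda nv: nv[1])
        return best

    def _all_nodes(node):
        yield node.instruction, score_instruction(node.instruction)
        for child in node.children:
            yield from _all_nodes(child)

    if __name__ == "__main__":
        print(mcts_evolve("Summarize the following article."))
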
Anthology ID:
2024.findings-emnlp.93
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2024
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1707–1721
URL:
https://aclanthology.org/2024.findings-emnlp.93
DOI:
10.18653/v1/2024.findings-emnlp.93
Cite (ACL):
Li Chenglin, Qianglong Chen, Zhi Li, FengTao FengTao, Yicheng Li, Hao Chen, Fei Yu, and Yin Zhang. 2024. Optimizing Instruction Synthesis: Effective Exploration of Evolutionary Space with Tree Search. In Findings of the Association for Computational Linguistics: EMNLP 2024, pages 1707–1721, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Optimizing Instruction Synthesis: Effective Exploration of Evolutionary Space with Tree Search (Chenglin et al., Findings 2024)
PDF:
https://preview.aclanthology.org/dois-2013-emnlp/2024.findings-emnlp.93.pdf
Software:
 2024.findings-emnlp.93.software.zip
Data:
 2024.findings-emnlp.93.data.zip