RESPROMPT: Residual Connection Prompting Advances Multi-Step Reasoning in Large Language Models
Song Jiang | Zahra Shakeri | Aaron Chan | Maziar Sanjabi | Hamed Firooz | Yinglong Xia | Bugra Akyildiz | Yizhou Sun | Jinchao Li | Qifan Wang | Asli Celikyilmaz
Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
Chain-of-thought (CoT) prompting has impressively unlocked the reasoning potential of large language models (LLMs). Yet, it falls short when tackling problems that require multiple reasoning steps. This limitation arises from the complex nature of multi-step reasoning processes: later stages often depend not only on the immediately preceding step, but also on results from several steps earlier. Such dependencies indicate that the reasoning process naturally forms a graph. The almost linear structure of CoT, however, struggles to capture this complex reasoning graph. To address this challenge, we propose Residual Connection Prompting (ResPrompt), a new prompting strategy that advances multi-step reasoning in LLMs. The core of our idea is to reconstruct the reasoning graph within prompts. We achieve this by integrating necessary connections, i.e., links present in the reasoning graph but missing from the linear CoT flow, into the prompts. Termed "residual connections", these links can transform linear CoT into the complex reasoning graphs that multi-step problems entail. On benchmarks across math, sequential, and commonsense domains, ResPrompt demonstrates clear improvements in multi-step reasoning compared with CoT. Through extensive ablation studies and analyses, we pinpoint how to effectively build residual connections and also identify situations where they might be unnecessary.
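The sketch below is a minimal illustration of the idea described in the abstract: a few-shot exemplar in which later reasoning steps explicitly restate results from earlier, non-adjacent steps ("residual connections") rather than relying only on the immediately preceding line, as plain CoT does. The example problem, wording, and `build_prompt` helper are hypothetical and are not taken from the paper; the authors' actual prompt format may differ.

```python
# Hypothetical ResPrompt-style exemplar (illustrative only, not the paper's
# actual prompts). In the residual-connection version, step 3 explicitly
# repeats the results of steps 1 and 2, and step 4 repeats the subtotal,
# making non-adjacent dependencies visible in the prompt itself.

COT_EXEMPLAR = """\
Q: A store sells pens at $2 each and notebooks at $5 each. Ana buys 3 pens
and 2 notebooks, then gets a $4 discount. How much does she pay?
A: 3 pens cost 3 * 2 = 6 dollars.
2 notebooks cost 2 * 5 = 10 dollars.
The total is 6 + 10 = 16 dollars.
After the discount she pays 16 - 4 = 12 dollars. The answer is 12.
"""

RESPROMPT_EXEMPLAR = """\
Q: A store sells pens at $2 each and notebooks at $5 each. Ana buys 3 pens
and 2 notebooks, then gets a $4 discount. How much does she pay?
A: The pens cost 3 * 2 = 6 dollars.
The notebooks cost 2 * 5 = 10 dollars.
Adding the pen cost (6 dollars) and the notebook cost (10 dollars),
the subtotal is 6 + 10 = 16 dollars.
Taking the subtotal (16 dollars) and subtracting the discount (4 dollars),
she pays 16 - 4 = 12 dollars. The answer is 12.
"""

def build_prompt(exemplar: str, question: str) -> str:
    """Prepend a few-shot exemplar to a new question for the LLM."""
    return f"{exemplar}\nQ: {question}\nA:"

if __name__ == "__main__":
    # The new question would be filled in by the user or benchmark harness.
    print(build_prompt(RESPROMPT_EXEMPLAR, "<new multi-step question>"))
```

Contrasting the two exemplars shows the design choice: the residual-connection version carries forward intermediate results by name and value, so a step that depends on an earlier, non-adjacent result does not have to be inferred from the linear chain alone.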