Discrete Prompt Optimization via Constrained Generation for Zero-shot Re-ranker

Sukmin Cho, Soyeong Jeong, Jeong yeon Seo, Jong Park


Abstract
Re-rankers, which order retrieved documents with respect to their relevance to a given query, have gained attention in information retrieval (IR). Rather than fine-tuning a pre-trained language model (PLM), large language models (LLMs) have been used as zero-shot re-rankers with excellent results. While LLMs are highly sensitive to prompts, the impact of prompts on zero-shot re-ranking, and their optimization, have not yet been explored. In addition to highlighting the impact of prompt optimization on the zero-shot re-ranker, we propose a novel discrete prompt optimization method, Constrained Prompt generation (Co-Prompt), together with a metric that estimates how close a prompt is to the optimum for re-ranking. Co-Prompt guides the text generated by a PLM toward optimal prompts according to this metric, without any parameter updates. Experimental results demonstrate that Co-Prompt achieves outstanding re-ranking performance against the baselines. Moreover, Co-Prompt generates prompts that are more interpretable to humans than those produced by other prompt optimization methods.
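
To make the idea concrete, below is a minimal, hypothetical sketch of metric-guided discrete prompt generation in the spirit of the abstract, not the authors' exact Co-Prompt procedure: a generator PLM constrains the candidate pool to its most likely next tokens, and each candidate prompt is scored by how well it separates relevant from irrelevant documents on a tiny development set. The helper names (rerank_metric, query_log_likelihood, DEV_SET) and the use of GPT-2 as both generator and zero-shot re-ranker are illustrative assumptions.

# Hypothetical sketch: greedy, metric-guided prompt generation for a zero-shot re-ranker.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
lm = GPT2LMHeadModel.from_pretrained("gpt2")
lm.eval()

# Toy dev set: (query, relevant document, irrelevant document) triples.
DEV_SET = [
    ("what causes rain",
     "Rain forms when water vapor in the air condenses into droplets.",
     "The stock market closed higher today after a volatile session."),
]

@torch.no_grad()
def query_log_likelihood(doc: str, prompt: str, query: str) -> float:
    """Log-likelihood of the query tokens given `doc + prompt` under the LM."""
    context = tokenizer(doc + " " + prompt, return_tensors="pt").input_ids
    query_ids = tokenizer(" " + query, return_tensors="pt").input_ids
    ids = torch.cat([context, query_ids], dim=1)
    log_probs = torch.log_softmax(lm(ids).logits[0, :-1], dim=-1)
    start = context.shape[1] - 1            # positions that predict the query tokens
    targets = ids[0, 1:]
    return log_probs[start:, :].gather(1, targets[start:, None]).sum().item()

def rerank_metric(prompt: str) -> float:
    """How strongly the prompt separates relevant from irrelevant documents."""
    return sum(
        query_log_likelihood(rel, prompt, q) - query_log_likelihood(irr, prompt, q)
        for q, rel, irr in DEV_SET
    )

@torch.no_grad()
def generate_prompt(seed: str = "Please", max_tokens: int = 6, top_k: int = 10) -> str:
    prompt = seed
    for _ in range(max_tokens):
        ids = tokenizer(prompt, return_tensors="pt").input_ids
        next_logits = lm(ids).logits[0, -1]
        candidates = torch.topk(next_logits, top_k).indices   # constrain search to fluent continuations
        # Greedily keep the candidate that maximizes the re-ranking metric (slow but simple).
        scored = [(rerank_metric(prompt + tokenizer.decode(int(t))),
                   prompt + tokenizer.decode(int(t))) for t in candidates]
        prompt = max(scored)[1]
    return prompt

if __name__ == "__main__":
    print(generate_prompt())

In this sketch the generator only proposes fluent continuations, while the re-ranking metric decides which continuation to keep, mirroring the constrained, gradient-free search described in the abstract.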
Anthology ID:
2023.findings-acl.61
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
960–971
URL:
https://aclanthology.org/2023.findings-acl.61
DOI:
10.18653/v1/2023.findings-acl.61
Cite (ACL):
Sukmin Cho, Soyeong Jeong, Jeong yeon Seo, and Jong Park. 2023. Discrete Prompt Optimization via Constrained Generation for Zero-shot Re-ranker. In Findings of the Association for Computational Linguistics: ACL 2023, pages 960–971, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Discrete Prompt Optimization via Constrained Generation for Zero-shot Re-ranker (Cho et al., Findings 2023)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2023.findings-acl.61.pdf
Video:
https://preview.aclanthology.org/emnlp-22-attachments/2023.findings-acl.61.mp4