Over-Generation and Compaction: A Prompting Strategy for Procedural Text Adaptation with Large Language Models

Hyeongsik Kim, Yanheng Xu, Chaoqun Dong, Fei Du


Abstract
Procedural text adaptation—such as modifying recipes or revising instructional guides—has traditionally relied on specialized models extensively fine-tuned for specific domains. To address the scalability limitations of such approaches, recent research has increasingly turned to general-purpose large language models (LLMs). However, existing prompting strategies for LLMs often yield superficial or erroneous adaptations due to alignment-induced biases and the inherent complexity of procedural editing. To overcome these challenges, we propose the Over-Generation-and-Compaction (OC) prompting strategy, which first elicits an exhaustive set of procedural details to leverage the model’s latent knowledge, and subsequently compacts them into concise, coherent adaptations. We further introduce Recipe Consistency & Feasibility (RCF), a novel metric for systematically assessing procedural validity and practicality in cooking recipe adaptations. Experiments on public datasets demonstrate that OC significantly improves adaptation consistency and feasibility compared to baseline prompting methods, without additional fine-tuning or curated training resources.
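The two-stage elicit-then-compact flow described in the abstract can be illustrated with a minimal sketch. The snippet below assumes an OpenAI-style chat-completions client; the model name, prompt wording, and helper functions (chat, adapt_recipe_oc) are hypothetical illustrations, not the paper's actual prompts or implementation.

```python
# Minimal sketch of the Over-Generation-and-Compaction (OC) prompting flow.
# Assumes the openai Python SDK (>=1.0) and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()
MODEL = "gpt-4o-mini"  # illustrative choice; any chat-capable LLM works


def chat(prompt: str) -> str:
    """Send a single-turn prompt and return the model's reply text."""
    resp = client.chat.completions.create(
        model=MODEL,
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content


def adapt_recipe_oc(recipe: str, constraint: str) -> str:
    # Stage 1 (over-generation): elicit an exhaustive set of procedural
    # details so the model surfaces its latent knowledge about the task.
    overgenerated = chat(
        "Adapt the following recipe to satisfy this constraint: "
        f"{constraint}\n\nRecipe:\n{recipe}\n\n"
        "List every relevant ingredient substitution, equipment change, "
        "and step-level modification you can think of, however minor."
    )
    # Stage 2 (compaction): distill the over-generated candidates into a
    # concise, coherent adapted recipe.
    return chat(
        "Using only the candidate modifications below, write a concise, "
        "coherent adapted recipe. Drop redundant or infeasible items.\n\n"
        f"Candidate modifications:\n{overgenerated}"
    )
```

In this sketch, stage 1 deliberately over-generates candidate edits and stage 2 prunes them into a coherent result, mirroring the elicit-then-compact structure the abstract describes.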
Anthology ID:
2025.findings-emnlp.1052
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
19306–19337
URL:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.1052/
DOI:
10.18653/v1/2025.findings-emnlp.1052
Cite (ACL):
Hyeongsik Kim, Yanheng Xu, Chaoqun Dong, and Fei Du. 2025. Over-Generation and Compaction: A Prompting Strategy for Procedural Text Adaptation with Large Language Models. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 19306–19337, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Over-Generation and Compaction: A Prompting Strategy for Procedural Text Adaptation with Large Language Models (Kim et al., Findings 2025)
PDF:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.1052.pdf
Checklist:
2025.findings-emnlp.1052.checklist.pdf