Writing Like the Best: Exemplar-Based Expository Text Generation

Yuxiang Liu, Kevin Chen-Chuan Chang


Abstract
We introduce the Exemplar-Based Expository Text Generation task, which aims to generate an expository text on a new topic using an exemplar on a similar topic. Current methods fall short due to their reliance on extensive exemplar data, difficulty in adapting topic-specific content, and issues with long-text coherence. To address these challenges, we propose the concept of Adaptive Imitation and present a novel Recurrent Plan-then-Adapt (RePA) framework. RePA leverages large language models (LLMs) for effective adaptive imitation through a fine-grained plan-then-adapt process. RePA also enables recurrent segment-by-segment imitation, supported by two memory structures that enhance input clarity and output coherence. We also develop task-specific evaluation metrics (imitativeness, adaptiveness, and adaptive-imitativeness) using LLMs as evaluators. Experimental results on three diverse datasets we collected demonstrate that RePA surpasses existing baselines in producing factual, consistent, and relevant texts for this task.
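To make the described pipeline concrete, here is a minimal Python sketch of a recurrent plan-then-adapt loop as the abstract outlines it. All function names, prompt wording, and the exact contents of the two memory structures are illustrative assumptions rather than the authors' implementation; `llm` stands for any text-completion callable.

```python
# Hypothetical sketch of a Recurrent Plan-then-Adapt (RePA) style loop.
# Prompts and memory contents are assumptions for illustration only.

def repa_generate(exemplar_segments, new_topic, llm):
    """Generate an expository text on `new_topic` by adaptively
    imitating an exemplar, one segment at a time."""
    plan_memory = []    # assumed memory: plans extracted so far (input clarity)
    output_memory = []  # assumed memory: segments written so far (output coherence)

    for segment in exemplar_segments:
        # Plan: distill a fine-grained, topic-agnostic plan from the exemplar segment.
        plan = llm(
            "Summarize the rhetorical plan of this exemplar segment, "
            "independent of its specific topic:\n" + segment
            + "\nPlans of earlier segments: " + "; ".join(plan_memory)
        )
        plan_memory.append(plan)

        # Adapt: realize the plan with content specific to the new topic,
        # conditioning on previously generated text to stay coherent.
        new_segment = llm(
            f"Write a segment about {new_topic} that follows this plan:\n{plan}"
            + "\nText generated so far:\n" + "".join(output_memory)
        )
        output_memory.append(new_segment)

    return "".join(output_memory)

# Example usage with a stub LLM callable:
# text = repa_generate(["Intro ...", "Details ..."], "a new topic",
#                      llm=lambda prompt: "...")
```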
Anthology ID: 2025.acl-long.1250
Volume: Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: July
Year: 2025
Address: Vienna, Austria
Editors: Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 25739–25764
URL: https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.1250/
Cite (ACL): Yuxiang Liu and Kevin Chen-Chuan Chang. 2025. Writing Like the Best: Exemplar-Based Expository Text Generation. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 25739–25764, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal): Writing Like the Best: Exemplar-Based Expository Text Generation (Liu & Chang, ACL 2025)
PDF: https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.1250.pdf