Intention-Adaptive LLM Fine-Tuning for Text Revision Generation

Zhexiong Liu, Diane Litman


Abstract
Large Language Models (LLMs) have achieved impressive capabilities in various context-based text generation tasks, such as summarization and reasoning; however, their applications in intention-based generation tasks remain underexplored. One such example is revision generation, which requires the generated text to explicitly reflect the writer's actual intentions. Identifying intentions and generating desirable revisions are challenging due to their complex and diverse nature. Although prior work has employed LLMs to generate revisions with few-shot learning, such approaches struggle to handle entangled multi-intent scenarios. While fine-tuning LLMs with intention-based instructions appears promising, it demands large amounts of annotated data, which is expensive to obtain and scarce in the revision community. To address these challenges, we propose Intention-Tuning, an intention-adaptive layer-wise LLM fine-tuning framework that dynamically selects a subset of LLM layers to learn the intentions and subsequently transfers their representations to revision generation. Experimental results suggest that Intention-Tuning is effective and efficient on small revision corpora, outperforming several PEFT baselines.
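The core mechanism named in the abstract, tuning only a dynamically selected subset of layers while the rest stay frozen, can be illustrated with a minimal sketch. This is not the paper's actual Intention-Tuning implementation; the toy model, the `select_and_freeze` helper, and the hard-coded layer choice are all hypothetical stand-ins for whatever intention-driven selection the framework performs.

```python
import torch
import torch.nn as nn

# Hypothetical toy stand-in for an LLM: a small stack of layers.
class ToyLM(nn.Module):
    def __init__(self, dim=16, num_layers=6):
        super().__init__()
        self.layers = nn.ModuleList(nn.Linear(dim, dim) for _ in range(num_layers))

    def forward(self, x):
        for layer in self.layers:
            x = torch.relu(layer(x))
        return x

def select_and_freeze(model, selected):
    """Freeze every parameter, then unfreeze only the selected layers,
    so gradient updates touch a small, task-adaptive subset of the model."""
    for p in model.parameters():
        p.requires_grad = False
    for i in selected:
        for p in model.layers[i].parameters():
            p.requires_grad = True

model = ToyLM()
# In the paper the subset is chosen adaptively from intention signals;
# here it is fixed purely for illustration.
select_and_freeze(model, selected=[2, 3])

trainable = [i for i, layer in enumerate(model.layers)
             if all(p.requires_grad for p in layer.parameters())]
print(trainable)  # → [2, 3]
```

Because only the selected layers receive gradients, an optimizer built over `filter(lambda p: p.requires_grad, model.parameters())` updates a fraction of the weights, which is what makes this style of fine-tuning parameter-efficient on small corpora.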
Anthology ID:
2026.findings-eacl.65
Volume:
Findings of the Association for Computational Linguistics: EACL 2026
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Vera Demberg, Kentaro Inui, Lluís Màrquez
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1263–1281
URL:
https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.65/
Cite (ACL):
Zhexiong Liu and Diane Litman. 2026. Intention-Adaptive LLM Fine-Tuning for Text Revision Generation. In Findings of the Association for Computational Linguistics: EACL 2026, pages 1263–1281, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
Intention-Adaptive LLM Fine-Tuning for Text Revision Generation (Liu & Litman, Findings 2026)
PDF:
https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.65.pdf
Checklist:
 2026.findings-eacl.65.checklist.pdf