Revision for Concision: A Constrained Paraphrase Generation Task

Wenchuan Mu, Kwan Hui Lim


Abstract
Academic writing should be concise, as concise sentences better hold readers’ attention and convey meaning clearly. Writing concisely is challenging, however, and writers often struggle to revise their drafts. We introduce and formulate revising for concision as a natural language processing task at the sentence level. Revising for concision requires algorithms to rewrite a sentence using only the necessary words while preserving its meaning. The revised sentence is evaluated on its word choice, sentence structure, and organization; it must also retain the original meaning and remain syntactically sound. To aid these efforts, we curate and make available a benchmark parallel dataset for revising for concision. The dataset contains 536 pairs of sentences before and after revision, all collected from college writing centres. We also present and evaluate approaches to this problem, which may assist researchers in this area.
Anthology ID:
2022.tsar-1.6
Volume:
Proceedings of the Workshop on Text Simplification, Accessibility, and Readability (TSAR-2022)
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates (Virtual)
Venue:
TSAR
Publisher:
Association for Computational Linguistics
Pages:
57–76
URL:
https://aclanthology.org/2022.tsar-1.6
Cite (ACL):
Wenchuan Mu and Kwan Hui Lim. 2022. Revision for Concision: A Constrained Paraphrase Generation Task. In Proceedings of the Workshop on Text Simplification, Accessibility, and Readability (TSAR-2022), pages 57–76, Abu Dhabi, United Arab Emirates (Virtual). Association for Computational Linguistics.
Cite (Informal):
Revision for Concision: A Constrained Paraphrase Generation Task (Mu & Lim, TSAR 2022)
PDF:
https://preview.aclanthology.org/paclic-22-ingestion/2022.tsar-1.6.pdf
Dataset:
 2022.tsar-1.6.dataset.zip
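
The internal layout of the dataset archive is not described on this page. Below is a minimal Python sketch, under the assumption that the zip unpacks to a tab-separated file of original/revised sentence pairs (the member name concision_pairs.tsv is hypothetical), showing how the parallel data might be loaded and a simple length-reduction statistic computed. Adjust the file name and format after inspecting the actual archive.

```python
# Hypothetical loader for the revision-for-concision parallel dataset.
# Assumption: 2022.tsar-1.6.dataset.zip contains a tab-separated file with
# one "original<TAB>revised" pair per line. The real file names and format
# may differ; this is a sketch, not the authors' released loading code.

import csv
import zipfile


def load_pairs(zip_path: str, member: str = "concision_pairs.tsv"):
    """Yield (original, revised) sentence pairs from the assumed TSV member."""
    with zipfile.ZipFile(zip_path) as zf:
        with zf.open(member) as fh:
            lines = fh.read().decode("utf-8").splitlines()
    for row in csv.reader(lines, delimiter="\t"):
        if len(row) >= 2:
            yield row[0], row[1]


if __name__ == "__main__":
    pairs = list(load_pairs("2022.tsar-1.6.dataset.zip"))
    # Average reduction in whitespace-token count: a crude proxy for concision.
    reductions = [len(o.split()) - len(r.split()) for o, r in pairs]
    print(f"{len(pairs)} pairs, mean token reduction: "
          f"{sum(reductions) / max(len(reductions), 1):.2f}")
```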