PIP: Parse-Instructed Prefix for Syntactically Controlled Paraphrase Generation

Yixin Wan, Kuan-Hao Huang, Kai-Wei Chang


Abstract
Syntactically controlled paraphrase generation requires language models to generate paraphrases for sentences according to specific syntactic structures. Existing fine-tuning methods for this task are costly, as all of the model's parameters need to be updated during training. Inspired by recent studies on parameter-efficient learning, we propose Parse-Instructed Prefix (PIP), a novel adaptation of prefix-tuning that tunes large pre-trained language models for syntactically controlled paraphrase generation in a low-data setting at significantly lower training cost. We introduce two methods to instruct a model's encoder prefix to capture syntax-related knowledge: direct initiation (PIP-Direct) and indirect optimization (PIP-Indirect). Compared to traditional fine-tuning methods for this task, PIP is a compute-efficient alternative with 10 times fewer learnable parameters. Compared to existing prefix-tuning methods, PIP excels at capturing syntax-control information, achieving significantly higher performance at the same level of learnable parameter count.
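The abstract's efficiency claim rests on a property of prefix-tuning: only the prepended prefix vectors are trained while the backbone model stays frozen. As a rough illustration (all sizes below are assumed for the example, not taken from the paper), the prefix's learnable parameter count scales only with prefix length, layer count, and hidden dimension:

```python
# Minimal sketch of prefix-tuning's learnable parameter count.
# The model sizes below are illustrative assumptions, not the
# paper's actual configuration.

def prefix_param_count(num_layers: int, hidden_dim: int, prefix_len: int) -> int:
    # One learned key vector and one learned value vector per prefix
    # position, per layer; the backbone model itself stays frozen.
    return num_layers * 2 * prefix_len * hidden_dim

# Assumed sizes for a BART-base-like encoder (6 layers, hidden size 768):
total = prefix_param_count(num_layers=6, hidden_dim=768, prefix_len=200)
print(f"{total:,} learnable prefix parameters")  # 1,843,200
```

This comes to under two million learnable parameters for the assumed sizes, orders of magnitude below a full fine-tune of the same backbone; the exact ratio (the paper reports roughly 10x) depends on the prefix length and which model components are counted.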
Anthology ID:
2023.findings-acl.659
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
10372–10380
URL:
https://aclanthology.org/2023.findings-acl.659
DOI:
10.18653/v1/2023.findings-acl.659
Cite (ACL):
Yixin Wan, Kuan-Hao Huang, and Kai-Wei Chang. 2023. PIP: Parse-Instructed Prefix for Syntactically Controlled Paraphrase Generation. In Findings of the Association for Computational Linguistics: ACL 2023, pages 10372–10380, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
PIP: Parse-Instructed Prefix for Syntactically Controlled Paraphrase Generation (Wan et al., Findings 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-5/2023.findings-acl.659.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-5/2023.findings-acl.659.mp4