Controllable Active-Passive Voice Generation using Prefix Tuning

Valentin Knappich, Timo Pierre Schrader


Abstract
The prompting paradigm is an emerging trend in the field of Natural Language Processing (NLP) that aims to learn tasks by finding appropriate prompts rather than fine-tuning the model weights. Such prompts can express an intention, e.g., they can instruct a language model to generate a summary of a given event. In this paper, we study how to influence ("control") the language generation process such that the outcome fulfills a requested linguistic property. More specifically, we look at controllable active-passive (AP) voice generation, i.e., we require the model to generate a sentence in the requested voice. We build upon the prefix tuning approach and introduce control tokens that are trained on controllable AP generation. We create an AP subset of the WebNLG dataset to fine-tune these control tokens. Among four different models, the one trained with a contrastive learning approach yields the best results in terms of AP accuracy (~95%), but at the cost of decreased performance on the original WebNLG task.
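Since the abstract only names the technique, the following is a minimal, illustrative sketch of how per-voice control prefixes can be realized with prefix tuning, assuming a frozen HuggingFace GPT-2 backbone. The prefix length, initialization scale, and the "active"/"passive" keys are hypothetical choices for illustration, not the paper's actual implementation.

```python
# Minimal, illustrative prefix-tuning sketch (not the authors' code):
# one trainable "control prefix" per voice, prepended as past key/values
# to a frozen GPT-2. Prefix length and init scale are hypothetical.
import torch
import torch.nn as nn
from transformers import GPT2LMHeadModel, GPT2Tokenizer

model = GPT2LMHeadModel.from_pretrained("gpt2")
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
for p in model.parameters():
    p.requires_grad = False  # the language model stays frozen

n_layer = model.config.n_layer
n_head = model.config.n_head
head_dim = model.config.n_embd // n_head
prefix_len = 5  # hypothetical prefix length

# One prefix per control signal, stored as (layer, key/value, head, pos, dim).
prefixes = nn.ParameterDict({
    voice: nn.Parameter(0.02 * torch.randn(n_layer, 2, n_head, prefix_len, head_dim))
    for voice in ("active", "passive")
})

def control_past(voice, batch_size):
    """Expand the chosen prefix into GPT-2's legacy past_key_values format:
    a tuple of per-layer (key, value) pairs, each shaped
    (batch, n_head, prefix_len, head_dim)."""
    p = prefixes[voice]
    return tuple(
        (p[l, 0].unsqueeze(0).expand(batch_size, -1, -1, -1),
         p[l, 1].unsqueeze(0).expand(batch_size, -1, -1, -1))
        for l in range(n_layer)
    )

# Training step: only the prefix parameters receive gradients.
optimizer = torch.optim.AdamW(prefixes.parameters(), lr=1e-4)
batch = tokenizer("The mouse was chased by the cat.", return_tensors="pt")
mask = torch.ones(1, prefix_len + batch.input_ids.size(1))  # cover prefix + tokens
out = model(input_ids=batch.input_ids,
            past_key_values=control_past("passive", 1),
            attention_mask=mask,
            labels=batch.input_ids)
out.loss.backward()
optimizer.step()
```

At generation time, the same mechanism applies: selecting the "active" or "passive" prefix steers the frozen model toward the requested voice while the backbone weights remain untouched.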
Anthology ID:
2023.ranlp-stud.3
Volume:
Proceedings of the 8th Student Research Workshop associated with the International Conference Recent Advances in Natural Language Processing
Month:
September
Year:
2023
Address:
Varna, Bulgaria
Editors:
Momchil Hardalov, Zara Kancheva, Boris Velichkov, Ivelina Nikolova-Koleva, Milena Slavcheva
Venue:
RANLP
Publisher:
INCOMA Ltd., Shoumen, Bulgaria
Pages:
23–32
URL:
https://aclanthology.org/2023.ranlp-stud.3
Cite (ACL):
Valentin Knappich and Timo Pierre Schrader. 2023. Controllable Active-Passive Voice Generation using Prefix Tuning. In Proceedings of the 8th Student Research Workshop associated with the International Conference Recent Advances in Natural Language Processing, pages 23–32, Varna, Bulgaria. INCOMA Ltd., Shoumen, Bulgaria.
Cite (Informal):
Controllable Active-Passive Voice Generation using Prefix Tuning (Knappich & Schrader, RANLP 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-4/2023.ranlp-stud.3.pdf