Improving Controllable Text Generation with Position-Aware Weighted Decoding
Yuxuan Gu | Xiaocheng Feng | Sicheng Ma | Jiaming Wu | Heng Gong | Bing Qin
Findings of the Association for Computational Linguistics: ACL 2022
Weighted decoding methods, which combine a pretrained language model (LM) with a controller, have achieved promising results for controllable text generation. However, these models often suffer from a control strength/fluency trade-off: higher control strength is more likely to produce incoherent and repetitive text. In this paper, we show that this trade-off arises from the controller imposing the target attribute on the LM at improper positions. We propose a novel framework built on existing weighted decoding methods, called CAT-PAW, which introduces a lightweight regulator to adjust bias signals from the controller at different decoding positions. Experiments on positive sentiment control, topic control, and language detoxification show the effectiveness of CAT-PAW on 4 SOTA models.
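The abstract does not give implementation details, but the mechanism it describes can be illustrated with a minimal sketch. The sketch below assumes a standard weighted-decoding setup in which the controller's bias is added to the LM's next-token logits; the names `regulator_weight` and `base_strength` are illustrative placeholders, not the paper's notation, and the regulator is simplified to a scalar per decoding position.

```python
import torch

def position_aware_weighted_decoding_step(
    lm_logits: torch.Tensor,        # next-token logits from the pretrained LM, shape (vocab_size,)
    controller_bias: torch.Tensor,  # attribute bias signal from the controller, shape (vocab_size,)
    regulator_weight: float,        # position-dependent scale in [0, 1] from a lightweight regulator
    base_strength: float = 5.0,     # hypothetical global control-strength hyperparameter
) -> torch.Tensor:
    """Combine LM logits with a position-scaled controller bias.

    Plain weighted decoding would add `base_strength * controller_bias`
    uniformly at every step; scaling the bias by a per-position regulator
    weight lets control be applied strongly only where it is appropriate,
    which is the trade-off the abstract describes.
    """
    biased_logits = lm_logits + base_strength * regulator_weight * controller_bias
    return torch.softmax(biased_logits, dim=-1)  # next-token distribution
```

With `regulator_weight` fixed at 1.0 at every step this reduces to ordinary weighted decoding; letting it vary by position is, per the abstract, what CAT-PAW's regulator contributes.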