Fine-grained Text Style Transfer with Diffusion-Based Language Models

Yiwei Lyu, Tiange Luo, Jiacheng Shi, Todd Hollon, Honglak Lee


Abstract
Diffusion probabilistic models have shown great success in controllably generating high-quality images, and researchers have sought to bring this controllability to the text generation domain. Previous work on diffusion-based language models has shown that they can be trained without external knowledge (such as pre-trained weights) and still achieve stable performance and controllability. In this paper, we train a diffusion-based model on StylePTB, the standard benchmark for fine-grained text style transfer. The tasks in StylePTB require much more refined control over the output text than those evaluated in previous work, and our model achieves state-of-the-art performance on StylePTB for both individual and compositional transfers. Moreover, our model, trained on limited data from StylePTB without external knowledge, outperforms prior methods that rely on pretrained weights, embeddings, and external grammar parsers, which may indicate that diffusion-based language models have great potential in low-resource settings.
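To make the approach concrete, the sketch below illustrates one training step of a continuous diffusion process over word embeddings, in the style of Diffusion-LM; it is a minimal illustration under assumed hyperparameters (noise schedule, model size, x_0-prediction objective), not the authors' released code.

```python
# Hypothetical sketch: one training step of a continuous-diffusion language model.
# All names, dimensions, and the noise schedule below are illustrative assumptions.
import torch
import torch.nn as nn

T = 1000                                             # number of diffusion steps (assumed)
betas = torch.linspace(1e-4, 0.02, T)                # linear noise schedule (assumed)
alpha_bars = torch.cumprod(1.0 - betas, dim=0)       # cumulative signal retention

class Denoiser(nn.Module):
    """Tiny Transformer that predicts the clean embeddings x_0 from noisy x_t."""
    def __init__(self, dim=64, vocab=1000):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.time_mlp = nn.Sequential(nn.Linear(1, dim), nn.SiLU(), nn.Linear(dim, dim))
        self.out = nn.Linear(dim, dim)

    def forward(self, x_t, t):
        # Inject a learned timestep embedding so the model knows the noise level.
        h = x_t + self.time_mlp(t.float().unsqueeze(-1)).unsqueeze(1)
        return self.out(self.encoder(h))

model = Denoiser()
tokens = torch.randint(0, 1000, (8, 16))             # toy batch of token ids
x_0 = model.embed(tokens)                            # clean word embeddings

t = torch.randint(0, T, (8,))                        # random timestep per sample
a_bar = alpha_bars[t].view(-1, 1, 1)
noise = torch.randn_like(x_0)
x_t = a_bar.sqrt() * x_0 + (1 - a_bar).sqrt() * noise  # forward noising q(x_t | x_0)

loss = ((model(x_t, t) - x_0) ** 2).mean()           # predict x_0 with a simple MSE loss
loss.backward()
```

For style transfer, generation runs this process in reverse: starting from Gaussian noise, the denoiser iteratively refines the embeddings while a transfer condition steers each step, and the final embeddings are rounded back to vocabulary tokens.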
Anthology ID:
2023.repl4nlp-1.6
Volume:
Proceedings of the 8th Workshop on Representation Learning for NLP (RepL4NLP 2023)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Burcu Can, Maximilian Mozes, Samuel Cahyawijaya, Naomi Saphra, Nora Kassner, Shauli Ravfogel, Abhilasha Ravichander, Chen Zhao, Isabelle Augenstein, Anna Rogers, Kyunghyun Cho, Edward Grefenstette, Lena Voita
Venue:
RepL4NLP
Publisher:
Association for Computational Linguistics
Pages:
65–74
URL:
https://preview.aclanthology.org/build-pipeline-with-new-library/2023.repl4nlp-1.6/
DOI:
10.18653/v1/2023.repl4nlp-1.6
Cite (ACL):
Yiwei Lyu, Tiange Luo, Jiacheng Shi, Todd Hollon, and Honglak Lee. 2023. Fine-grained Text Style Transfer with Diffusion-Based Language Models. In Proceedings of the 8th Workshop on Representation Learning for NLP (RepL4NLP 2023), pages 65–74, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Fine-grained Text Style Transfer with Diffusion-Based Language Models (Lyu et al., RepL4NLP 2023)
PDF:
https://preview.aclanthology.org/build-pipeline-with-new-library/2023.repl4nlp-1.6.pdf