Semi-supervised Formality Style Transfer using Language Model Discriminator and Mutual Information Maximization

Kunal Chawla, Diyi Yang


Abstract
Formality style transfer is the task of converting informal sentences into grammatically correct formal sentences, which can improve the performance of many downstream NLP tasks. In this work, we propose a semi-supervised formality style transfer model that uses a language model-based discriminator to maximize the likelihood of the output sentence being formal, which allows us to train with token-level conditional probabilities. We further propose maximizing the mutual information between source and target styles as our training objective, instead of the regular likelihood, which often leads to repetitive and trivial generated responses. Experiments showed that our model significantly outperformed previous state-of-the-art baselines in terms of both automated metrics and human judgement. We further generalized our model to the unsupervised text style transfer task and achieved significant improvements on two benchmark sentiment style transfer datasets.
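
The two training signals described in the abstract can be illustrated with a minimal PyTorch sketch. This is not the authors' implementation (see the linked GT-SALT/FormalityStyleTransfer repository for that); the tensors gen_probs, formal_lm_log_probs, forward_ll, and backward_ll are hypothetical stand-ins for the generator's output distributions, a formal-style language model, and forward/backward sequence models, and the MMI-style surrogate is one common way to approximate mutual information maximization.

import torch

def lm_discriminator_loss(gen_probs, formal_lm_log_probs):
    # Expected negative log-likelihood of the generator's token distributions
    # under a formal-style language model, computed token by token; this keeps
    # the objective differentiable with respect to the generator outputs.
    # gen_probs:           (batch, time, vocab) generator output distributions
    # formal_lm_log_probs: (batch, time, vocab) log p_LM(token | prefix)
    expected_ll = (gen_probs * formal_lm_log_probs).sum(dim=-1)  # (batch, time)
    return -expected_ll.mean()

def mutual_information_loss(forward_ll, backward_ll, lam=0.5):
    # MMI-style surrogate: maximize log p(y|x) + lam * log p(x|y) instead of
    # the plain forward likelihood, which discourages generic, trivial outputs.
    # forward_ll:  (batch,) sentence-level log p(y|x) from the forward model
    # backward_ll: (batch,) sentence-level log p(x|y) from a backward model
    return -(forward_ll + lam * backward_ll).mean()

if __name__ == "__main__":
    # Toy tensors standing in for real model outputs.
    batch, steps, vocab = 4, 10, 1000
    logits = torch.randn(batch, steps, vocab, requires_grad=True)
    gen_probs = torch.softmax(logits, dim=-1)
    lm_log_probs = torch.log_softmax(torch.randn(batch, steps, vocab), dim=-1)
    forward_ll = torch.randn(batch, requires_grad=True)
    backward_ll = torch.randn(batch)
    loss = lm_discriminator_loss(gen_probs, lm_log_probs) + \
           mutual_information_loss(forward_ll, backward_ll)
    loss.backward()
    print(loss.item())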
Anthology ID:
2020.findings-emnlp.212
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Editors:
Trevor Cohn, Yulan He, Yang Liu
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2340–2354
URL:
https://aclanthology.org/2020.findings-emnlp.212
DOI:
10.18653/v1/2020.findings-emnlp.212
Cite (ACL):
Kunal Chawla and Diyi Yang. 2020. Semi-supervised Formality Style Transfer using Language Model Discriminator and Mutual Information Maximization. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 2340–2354, Online. Association for Computational Linguistics.
Cite (Informal):
Semi-supervised Formality Style Transfer using Language Model Discriminator and Mutual Information Maximization (Chawla & Yang, Findings 2020)
PDF:
https://preview.aclanthology.org/nschneid-patch-5/2020.findings-emnlp.212.pdf
Optional supplementary material:
2020.findings-emnlp.212.OptionalSupplementaryMaterial.zip
Video:
https://slideslive.com/38940140
Code:
GT-SALT/FormalityStyleTransfer
Data:
BookCorpus