Transductive Learning for Unsupervised Text Style Transfer

Fei Xiao, Liang Pang, Yanyan Lan, Yan Wang, Huawei Shen, Xueqi Cheng


Abstract
Unsupervised style transfer models are mainly based on an inductive learning approach, which represents the style as embeddings, decoder parameters, or discriminator parameters and applies these general rules directly to test cases. However, the lack of parallel corpora limits these inductive learning methods on this task, often producing severely inconsistent style expressions such as ‘the salad is rude’. To tackle this problem, we propose a novel transductive learning approach based on a retrieval-based context-aware style representation. Specifically, we use an attentional encoder-decoder with a retriever framework, which involves the top-K relevant sentences in the target style in the transfer process. In this way, we learn a context-aware style embedding that alleviates the inconsistency problem above. Both sparse (BM25) and dense (MIPS) retrieval functions are used, and two objective functions are designed to facilitate joint learning. Experimental results show that our method outperforms several strong baselines. The proposed transductive learning approach is general and effective for unsupervised style transfer, and we will apply it to the other two typical methods in the future.
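To make the retrieval-and-attention pipeline described in the abstract concrete, the following is a minimal sketch, assuming the rank_bm25 package for the sparse (BM25) retriever and PyTorch for the attention pooling. The function names, the toy corpus, and the random encoder states are illustrative placeholders, not the authors' released implementation (see the Code link below).

# Minimal sketch (assumptions: rank_bm25 for BM25 retrieval, PyTorch for attention;
# the encoder/decoder and tokenization are placeholders).
import torch
import torch.nn.functional as F
from rank_bm25 import BM25Okapi

def retrieve_top_k(source_sentence, target_style_corpus, k=4):
    """Return the top-K target-style sentences most relevant to the source sentence."""
    tokenized_corpus = [s.split() for s in target_style_corpus]
    bm25 = BM25Okapi(tokenized_corpus)
    return bm25.get_top_n(source_sentence.split(), target_style_corpus, n=k)

def context_aware_style_embedding(source_repr, retrieved_reprs):
    """Attend from the source representation over the retrieved target-style
    sentence representations to form a context-aware style embedding.

    source_repr:     (hidden,)    encoder state of the input sentence
    retrieved_reprs: (K, hidden)  encoder states of the K retrieved sentences
    """
    scores = retrieved_reprs @ source_repr   # (K,) relevance scores
    weights = F.softmax(scores, dim=0)       # attention over retrieved sentences
    return weights @ retrieved_reprs         # (hidden,) style embedding

# Hypothetical usage with a toy target-style corpus and random encoder states.
corpus = ["the salad is delicious", "friendly staff and great food", "loved the fresh bread"]
top_k = retrieve_top_k("the salad is rude", corpus, k=2)
hidden = 8
src = torch.randn(hidden)
ret = torch.randn(len(top_k), hidden)
style_emb = context_aware_style_embedding(src, ret)

In the abstract's framing, this context-aware style embedding would condition the decoder in place of a fixed per-style embedding, and a dense (MIPS) retriever could be swapped in for BM25 without changing the attention step.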
Anthology ID:
2021.emnlp-main.195
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
2510–2521
URL:
https://aclanthology.org/2021.emnlp-main.195
DOI:
10.18653/v1/2021.emnlp-main.195
Cite (ACL):
Fei Xiao, Liang Pang, Yanyan Lan, Yan Wang, Huawei Shen, and Xueqi Cheng. 2021. Transductive Learning for Unsupervised Text Style Transfer. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 2510–2521, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Transductive Learning for Unsupervised Text Style Transfer (Xiao et al., EMNLP 2021)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2021.emnlp-main.195.pdf
Software:
 2021.emnlp-main.195.Software.zip
Video:
 https://preview.aclanthology.org/ingestion-script-update/2021.emnlp-main.195.mp4
Code
 xiaofei05/tsst
Data
GYAFC