Exploring Non-Autoregressive Text Style Transfer

Yun Ma, Qing Li


Abstract
In this paper, we explore Non-AutoRegressive (NAR) decoding for unsupervised text style transfer. We first propose a base NAR model by directly adapting the common training scheme from its AutoRegressive (AR) counterpart. Despite its faster inference speed over the AR model, this NAR model sacrifices transfer performance due to the lack of conditional dependence between output tokens. To address this, we investigate three techniques, i.e., knowledge distillation, contrastive learning, and iterative decoding, for performance enhancement. Experimental results on two benchmark datasets suggest that, although the base NAR model is generally inferior to AR decoding, the performance gap can be clearly narrowed when empowering NAR decoding with knowledge distillation, contrastive learning, and iterative decoding.
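The abstract names iterative decoding as one of the techniques used to close the gap to AR decoding. The authors' actual implementation is in the linked repository below; purely as a hypothetical illustration of what iterative (mask-predict-style) NAR decoding looks like, the following PyTorch sketch assumes a model that scores all target positions in parallel. The function name, MASK_ID, and the dummy model are invented for the example and are not taken from the paper.

```python
# Hypothetical sketch (not the authors' released code; see
# sunlight-ym/nar_style_transfer) of iterative NAR decoding in the
# mask-predict style: predict every target position in parallel, then
# repeatedly re-mask the least confident tokens and predict them again.
import torch

MASK_ID = 0  # assumed vocabulary id of the [MASK] placeholder token


def iterative_nar_decode(model, src, tgt_len, num_iters=4):
    """model(src, tgt_tokens) -> logits of shape (batch, tgt_len, vocab)."""
    batch = src.size(0)
    tokens = torch.full((batch, tgt_len), MASK_ID, dtype=torch.long)
    for it in range(num_iters):
        logits = model(src, tokens)
        probs, tokens = logits.softmax(-1).max(-1)  # confidence + argmax token
        if it == num_iters - 1:
            break
        # linearly decay how many positions get re-masked each refinement step
        n_mask = int(tgt_len * (1.0 - (it + 1) / num_iters))
        if n_mask > 0:
            remask = probs.topk(n_mask, dim=-1, largest=False).indices
            tokens = tokens.scatter(1, remask, MASK_ID)
    return tokens


# Toy usage with a random "model" standing in for a trained NAR transformer.
dummy_model = lambda src, tgt: torch.randn(src.size(0), tgt.size(1), 100)
print(iterative_nar_decode(dummy_model, torch.zeros(2, 5, dtype=torch.long), tgt_len=6))
```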
Anthology ID:
2021.emnlp-main.730
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
9267–9278
URL:
https://aclanthology.org/2021.emnlp-main.730
DOI:
10.18653/v1/2021.emnlp-main.730
Cite (ACL):
Yun Ma and Qing Li. 2021. Exploring Non-Autoregressive Text Style Transfer. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 9267–9278, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Exploring Non-Autoregressive Text Style Transfer (Ma & Li, EMNLP 2021)
PDF:
https://preview.aclanthology.org/improve-issue-templates/2021.emnlp-main.730.pdf
Video:
https://preview.aclanthology.org/improve-issue-templates/2021.emnlp-main.730.mp4
Code
sunlight-ym/nar_style_transfer
Data
GYAFC