Controlled Text Generation Using Dictionary Prior in Variational Autoencoders

Xianghong Fang, Jian Li, Lifeng Shang, Xin Jiang, Qun Liu, Dit-Yan Yeung


Abstract
While variational autoencoders (VAEs) have been widely applied in text generation tasks, they face two challenges: insufficient representation capacity and poor controllability. The former results from posterior collapse and restrictive prior assumptions, which impede better representation learning. The latter arises because the continuous latent variables in traditional formulations limit the interpretability and controllability of VAEs. In this paper, we propose Dictionary Prior (DPrior), a new data-driven prior that enjoys the merits of expressivity and controllability. To facilitate controlled text generation with DPrior, we propose to employ contrastive learning to separate the latent space into several parts. Extensive experiments on both language modeling and controlled text generation demonstrate the effectiveness of the proposed approach.
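As context for the abstract, the core idea of a dictionary prior can be sketched as follows: the prior over latent codes is parameterized by a learnable dictionary of K embeddings, and a latent vector is formed as a convex combination of dictionary entries. The sketch below is illustrative only; the variable names, dimensions, and softmax-weighting scheme are assumptions for exposition, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

K, d = 16, 32  # dictionary size and latent dimensionality (illustrative choices)
dictionary = rng.normal(size=(K, d))  # learnable prior dictionary (random here)

def sample_from_dictionary_prior(query):
    """Form a latent code as a softmax-weighted mixture of dictionary entries.

    `query` stands in for an inference-network output; this weighting scheme
    is an assumption for illustration, not the paper's exact prior.
    """
    logits = dictionary @ query                # (K,) similarity scores
    weights = np.exp(logits - logits.max())
    weights /= weights.sum()                   # softmax over dictionary entries
    return weights @ dictionary                # convex combination, shape (d,)

z = sample_from_dictionary_prior(rng.normal(size=d))
print(z.shape)  # (32,)
```

Because each latent code lies in the convex hull of the dictionary entries, the entries act as interpretable anchors in latent space, which is what makes partitioning the space (e.g. via contrastive learning, as in the abstract) amenable to controlled generation.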
Anthology ID:
2022.findings-acl.10
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
97–111
URL:
https://aclanthology.org/2022.findings-acl.10
DOI:
10.18653/v1/2022.findings-acl.10
Cite (ACL):
Xianghong Fang, Jian Li, Lifeng Shang, Xin Jiang, Qun Liu, and Dit-Yan Yeung. 2022. Controlled Text Generation Using Dictionary Prior in Variational Autoencoders. In Findings of the Association for Computational Linguistics: ACL 2022, pages 97–111, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Controlled Text Generation Using Dictionary Prior in Variational Autoencoders (Fang et al., Findings 2022)
PDF:
https://preview.aclanthology.org/nschneid-patch-2/2022.findings-acl.10.pdf
Data
SNLI