Decomposable Neural Paraphrase Generation

Zichao Li, Xin Jiang, Lifeng Shang, Qun Liu


Abstract
Paraphrasing occurs at different levels of granularity, such as the lexical, phrasal, and sentential levels. This paper presents the Decomposable Neural Paraphrase Generator (DNPG), a Transformer-based model that can learn and generate paraphrases of a sentence at different levels of granularity in a disentangled way. Specifically, the model is composed of multiple encoders and decoders with different structures, each corresponding to a specific granularity. The empirical study shows that the decomposition mechanism of DNPG makes paraphrase generation more interpretable and controllable. Based on DNPG, we further develop an unsupervised domain adaptation method for paraphrase generation. Experimental results show that the proposed model achieves competitive in-domain performance compared to state-of-the-art neural models, and significantly better performance when adapting to a new domain.
Anthology ID:
P19-1332
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
3403–3414
URL:
https://aclanthology.org/P19-1332
DOI:
10.18653/v1/P19-1332
Cite (ACL):
Zichao Li, Xin Jiang, Lifeng Shang, and Qun Liu. 2019. Decomposable Neural Paraphrase Generation. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 3403–3414, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Decomposable Neural Paraphrase Generation (Li et al., ACL 2019)
PDF:
https://preview.aclanthology.org/nschneid-patch-1/P19-1332.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-1/P19-1332.mp4