Plan-CVAE: A Planning-based Conditional Variational Autoencoder for Story Generation

Lin Wang, Juntao Li, Rui Yan, Dongyan Zhao


Abstract
Story generation is the challenging task of automatically producing natural language that describes a sequence of events, requiring output with both a consistent topic and novel wording. Although many approaches have been proposed and notable progress has been made on this task, there is still considerable room for improvement, especially in thematic consistency and wording diversity. To narrow the gap between generated stories and those written by human writers, in this paper we propose a planning-based conditional variational autoencoder, namely Plan-CVAE, which first plans a keyword sequence and then generates a story based on that sequence. In our method, the keyword planning strategy improves thematic consistency, while the CVAE module enhances wording diversity. Experimental results on a benchmark dataset confirm that our proposed method can generate stories with both thematic consistency and wording novelty, and that it outperforms state-of-the-art methods on both automatic metrics and human evaluations.
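The following is a minimal, illustrative sketch (not the authors' released code) of the two-stage plan-then-generate idea described in the abstract: a keyword planner proposes one keyword per sentence from the title, and a conditional VAE generates each sentence conditioned on its keyword, with a sampled latent variable encouraging wording diversity. Module names, dimensions, and the greedy keyword selection are assumptions made for illustration only.

```python
# Minimal sketch of a plan-then-generate pipeline with a conditional VAE.
# All module names, dimensions, and decoding choices are illustrative
# assumptions, not the paper's exact architecture.
import torch
import torch.nn as nn

class KeywordPlanner(nn.Module):
    """Stage 1: encode the title and pick keywords (one per sentence)."""
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256, n_keywords=5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)
        self.n_keywords = n_keywords

    def forward(self, title_ids):
        _, h = self.encoder(self.embed(title_ids))            # h: (1, B, hid)
        logits = self.out(h.squeeze(0))                       # (B, vocab)
        # Greedy illustration: take the top-k words as the keyword plan.
        return logits.topk(self.n_keywords, dim=-1).indices   # (B, n_keywords)

class StoryCVAE(nn.Module):
    """Stage 2: CVAE generating a sentence conditioned on one keyword."""
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256, z_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.post_enc = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.to_mu = nn.Linear(hid_dim + emb_dim, z_dim)
        self.to_logvar = nn.Linear(hid_dim + emb_dim, z_dim)
        self.dec_init = nn.Linear(z_dim + emb_dim, hid_dim)
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, keyword_ids, sent_ids):
        kw = self.embed(keyword_ids)                           # (B, emb)
        _, h = self.post_enc(self.embed(sent_ids))             # (1, B, hid)
        cond = torch.cat([h.squeeze(0), kw], dim=-1)
        mu, logvar = self.to_mu(cond), self.to_logvar(cond)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()   # reparameterize
        h0 = torch.tanh(self.dec_init(torch.cat([z, kw], -1))).unsqueeze(0)
        dec_out, _ = self.decoder(self.embed(sent_ids), h0)
        logits = self.out(dec_out)                             # (B, T, vocab)
        kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
        return logits, kl

if __name__ == "__main__":
    planner, cvae = KeywordPlanner(1000), StoryCVAE(1000)
    title = torch.randint(0, 1000, (2, 6))       # batch of 2 toy titles
    keywords = planner(title)                    # (2, 5) planned keyword ids
    sent = torch.randint(0, 1000, (2, 12))       # gold sentence for training
    logits, kl = cvae(keywords[:, 0], sent)      # condition on first keyword
    print(logits.shape, kl.item())
```

In training, the reconstruction loss over `logits` plus the KL term would be minimized; at inference, the posterior encoder is replaced by sampling the latent variable from the prior, which is where the wording diversity discussed in the abstract comes from.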
Anthology ID:
2020.ccl-1.83
Volume:
Proceedings of the 19th Chinese National Conference on Computational Linguistics
Month:
October
Year:
2020
Address:
Haikou, China
Editors:
Maosong Sun (孙茂松), Sujian Li (李素建), Yue Zhang (张岳), Yang Liu (刘洋)
Venue:
CCL
Publisher:
Chinese Information Processing Society of China
Note:
Pages:
892–902
Language:
English
URL:
https://aclanthology.org/2020.ccl-1.83
Cite (ACL):
Lin Wang, Juntao Li, Rui Yan, and Dongyan Zhao. 2020. Plan-CVAE: A Planning-based Conditional Variational Autoencoder for Story Generation. In Proceedings of the 19th Chinese National Conference on Computational Linguistics, pages 892–902, Haikou, China. Chinese Information Processing Society of China.
Cite (Informal):
Plan-CVAE: A Planning-based Conditional Variational Autoencoder for Story Generation (Wang et al., CCL 2020)
PDF:
https://preview.aclanthology.org/nschneid-patch-4/2020.ccl-1.83.pdf
Data
ROCStories