Narrative Text Generation with a Latent Discrete Plan

Harsh Jhamtani, Taylor Berg-Kirkpatrick


Abstract
Past work on story generation has demonstrated the usefulness of conditioning on a generation plan to generate coherent stories. However, these approaches have used heuristics or off-the-shelf models to first tag training stories with the desired type of plan, and then train generation models in a supervised fashion. In this paper, we propose a deep latent variable model that first samples a sequence of anchor words, one per sentence in the story, as part of its generative process. During training, our model treats the sequence of anchor words as a latent variable and attempts to induce anchoring sequences that help guide generation in an unsupervised fashion. We conduct experiments with several types of sentence decoder distributions: left-to-right and non-monotonic, with different degrees of restriction. Further, since we use amortized variational inference to train our model, we introduce two corresponding types of inference networks for predicting the posterior on anchor words. We conduct human evaluations which demonstrate that the stories produced by our model are rated better than baselines that do not consider story plans, and are similar or better in quality relative to baselines that use external supervision for plans. Additionally, the proposed model achieves favorable scores on perplexity, diversity, and control of the story via the discrete plan.
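The abstract describes the model only at a high level. The toy sketch below illustrates one way the latent-anchor objective could look: a discrete anchor per sentence, a decoder conditioned on that anchor, an inference network predicting the posterior over anchors, and an ELBO. All module names, sizes, and the exact enumeration over a small anchor set are illustrative assumptions, not the authors' implementation (their code is in the linked repository).

```python
# Hypothetical sketch of a latent discrete anchor per sentence, trained with
# a variational objective. Everything below is an assumption for illustration.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB = 100     # toy word vocabulary size (assumption)
ANCHORS = 20    # candidate anchor words per sentence (assumption)
HID = 64

class SentenceDecoder(nn.Module):
    """Left-to-right decoder that scores a sentence given an anchor id."""
    def __init__(self):
        super().__init__()
        self.word_emb = nn.Embedding(VOCAB, HID)
        self.anchor_emb = nn.Embedding(ANCHORS, HID)
        self.rnn = nn.GRU(HID, HID, batch_first=True)
        self.out = nn.Linear(HID, VOCAB)

    def log_prob(self, sent, anchor):
        # sent: (B, T) word ids; anchor: (B,) anchor ids
        h0 = self.anchor_emb(anchor).unsqueeze(0)        # anchor sets the initial state
        out, _ = self.rnn(self.word_emb(sent[:, :-1]), h0)
        logp = F.log_softmax(self.out(out), dim=-1)      # (B, T-1, VOCAB)
        tgt = sent[:, 1:].unsqueeze(-1)
        return logp.gather(-1, tgt).squeeze(-1).sum(-1)  # (B,)

class InferenceNet(nn.Module):
    """Predicts q(anchor | sentence) from a bag-of-words encoding."""
    def __init__(self):
        super().__init__()
        self.word_emb = nn.Embedding(VOCAB, HID)
        self.proj = nn.Linear(HID, ANCHORS)

    def forward(self, sent):
        return F.log_softmax(self.proj(self.word_emb(sent).mean(1)), dim=-1)

def elbo(decoder, infnet, sent, prior_logp):
    """ELBO for one batch of sentences, enumerating the small anchor set exactly
    (a simplification; a sampled estimator would be used at realistic scale)."""
    q_logp = infnet(sent)                                       # (B, ANCHORS)
    batch = sent.size(0)
    lls = torch.stack(
        [decoder.log_prob(sent, torch.full((batch,), a, dtype=torch.long))
         for a in range(ANCHORS)], dim=-1)                      # (B, ANCHORS)
    kl = (q_logp.exp() * (q_logp - prior_logp)).sum(-1)         # KL(q || p)
    return (q_logp.exp() * lls).sum(-1) - kl                    # (B,)

# Toy usage: one gradient step on a random batch of sentences.
decoder, infnet = SentenceDecoder(), InferenceNet()
sent = torch.randint(0, VOCAB, (4, 12))
prior_logp = torch.full((ANCHORS,), -math.log(ANCHORS))         # uniform prior over anchors
loss = -elbo(decoder, infnet, sent, prior_logp).mean()
loss.backward()
```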
Anthology ID:
2020.findings-emnlp.325
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Editors:
Trevor Cohn, Yulan He, Yang Liu
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3637–3650
URL:
https://aclanthology.org/2020.findings-emnlp.325
DOI:
10.18653/v1/2020.findings-emnlp.325
Cite (ACL):
Harsh Jhamtani and Taylor Berg-Kirkpatrick. 2020. Narrative Text Generation with a Latent Discrete Plan. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 3637–3650, Online. Association for Computational Linguistics.
Cite (Informal):
Narrative Text Generation with a Latent Discrete Plan (Jhamtani & Berg-Kirkpatrick, Findings 2020)
PDF:
https://preview.aclanthology.org/proper-vol2-ingestion/2020.findings-emnlp.325.pdf
Code:
harsh19/Latent-Anchor-Plan