Guided Neural Language Generation for Automated Storytelling
Prithviraj Ammanabrolu, Ethan Tien, Wesley Cheung, Zhaochen Luo, William Ma, Lara J. Martin, Mark O. Riedl
Abstract
Neural network based approaches to automated story plot generation attempt to learn how to generate novel plots from a corpus of natural language plot summaries. Prior work has shown that a semantic abstraction of sentences called events improves neural plot generation and allows one to decompose the problem into: (1) the generation of a sequence of events (event-to-event) and (2) the transformation of these events into natural language sentences (event-to-sentence). However, typical neural language generation approaches to event-to-sentence can ignore the event details and produce grammatically-correct but semantically-unrelated sentences. We present an ensemble-based model that generates natural language guided by events. Our method outperforms the baseline sequence-to-sequence model. Additionally, we provide results for a full end-to-end automated story generation system, demonstrating how our model works with existing systems designed for the event-to-event problem.
- Anthology ID: W19-3405
- Volume: Proceedings of the Second Workshop on Storytelling
- Month: August
- Year: 2019
- Address: Florence, Italy
- Editors: Francis Ferraro, Ting-Hao ‘Kenneth’ Huang, Stephanie M. Lukin, Margaret Mitchell
- Venue: Story-NLP
- Publisher: Association for Computational Linguistics
- Pages: 46–55
- URL: https://aclanthology.org/W19-3405
- DOI: 10.18653/v1/W19-3405
- Cite (ACL): Prithviraj Ammanabrolu, Ethan Tien, Wesley Cheung, Zhaochen Luo, William Ma, Lara J. Martin, and Mark O. Riedl. 2019. Guided Neural Language Generation for Automated Storytelling. In Proceedings of the Second Workshop on Storytelling, pages 46–55, Florence, Italy. Association for Computational Linguistics.
- Cite (Informal): Guided Neural Language Generation for Automated Storytelling (Ammanabrolu et al., Story-NLP 2019)
- PDF: https://preview.aclanthology.org/fix-dup-bibkey/W19-3405.pdf
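The two-stage decomposition described in the abstract can be sketched as a simple pipeline. The sketch below is purely illustrative: the event representation as a (subject, verb, object, modifier) tuple follows the prior work the abstract cites, but both stage functions here are hypothetical rule-based stand-ins for the paper's neural event-to-event and ensemble event-to-sentence models.

```python
# Illustrative sketch of the event-to-event / event-to-sentence pipeline.
# Both models below are toy stand-ins, NOT the paper's neural models.
from typing import List, Tuple

# An event abstracts a sentence as (subject, verb, object, modifier).
Event = Tuple[str, str, str, str]

def event_to_event(history: List[Event]) -> Event:
    """Stand-in for the event-to-event model: predicts the next plot event."""
    # Toy continuity rule: the previous event's object becomes the next subject.
    subj = history[-1][2] if history else "hero"
    return (subj, "confront", "villain", "castle")

def event_to_sentence(event: Event) -> str:
    """Stand-in for the event-to-sentence model: realizes an event as text."""
    subj, verb, obj, mod = event
    return f"The {subj} {verb}s the {obj} at the {mod}."

def generate_story(seed: Event, n_events: int = 3) -> List[str]:
    """End-to-end loop: extend the event sequence, then realize each event."""
    events = [seed]
    for _ in range(n_events - 1):
        events.append(event_to_event(events))
    return [event_to_sentence(e) for e in events]
```

The key design point the abstract argues for is visible even in this sketch: because surface realization is conditioned on a structured event rather than on raw prior text, the event-to-sentence stage can be constrained to preserve the event's details instead of drifting into semantically-unrelated output.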