Guiding Neural Story Generation with Reader Models

Xiangyu Peng, Kaige Xie, Amal Alabdulkarim, Harshith Kayam, Samihan Dani, Mark Riedl

Abstract
Automated storytelling has long captured the attention of researchers for the ubiquity of narratives in everyday life. However, it is challenging to maintain coherence and stay on-topic toward a specific ending when generating narratives with neural language models. In this paper, we introduce Story generation with Reader Models (StoRM), a framework in which a reader model is used to reason about how the story should progress. A reader model infers what a human reader believes about the concepts, entities, and relations of the fictional story world. We show how an explicit reader model represented as a knowledge graph affords story coherence and provides controllability in the form of achieving a given story world state goal. Experiments show that our model produces significantly more coherent and on-topic stories, outperforming baselines in dimensions including plot plausibility and staying on topic.
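
The paper details the full StoRM pipeline; as a rough illustration only, the toy Python sketch below shows one way a knowledge-graph reader model could steer generation toward a goal world state. Every name here (extract_triples, goal_overlap, choose_continuation) and the toy triple lookup are hypothetical, not the authors' implementation: a real system would use a language model to propose candidate continuations and a learned extractor to infer triples.

# Hypothetical sketch (not the authors' code): a reader model as a set of
# (subject, relation, object) triples inferred from the story so far, with
# candidate continuations ranked by how much they advance a goal graph.
from typing import List, Set, Tuple

Triple = Tuple[str, str, str]

def extract_triples(sentence: str) -> Set[Triple]:
    """Stand-in for a learned triple extractor; here a toy lookup table."""
    toy_kb = {
        "The knight rode to the castle.": {("knight", "at", "castle")},
        "The knight sharpened his sword.": {("knight", "has", "sharp sword")},
        "The knight fought the dragon.": {("knight", "fights", "dragon")},
    }
    return toy_kb.get(sentence, set())

def goal_overlap(reader_graph: Set[Triple], goal_graph: Set[Triple]) -> int:
    """Score = number of goal triples already present in the reader model."""
    return len(reader_graph & goal_graph)

def choose_continuation(story_graph: Set[Triple],
                        candidates: List[str],
                        goal_graph: Set[Triple]) -> str:
    """Greedily pick the candidate whose inferred triples best advance the goal."""
    def score(sentence: str) -> int:
        updated = story_graph | extract_triples(sentence)
        return goal_overlap(updated, goal_graph)
    return max(candidates, key=score)

if __name__ == "__main__":
    reader_model: Set[Triple] = set()            # empty graph at story start
    goal = {("knight", "fights", "dragon")}      # desired story-world state
    candidates = [
        "The knight rode to the castle.",
        "The knight sharpened his sword.",
        "The knight fought the dragon.",
    ]
    best = choose_continuation(reader_model, candidates, goal)
    reader_model |= extract_triples(best)        # update the reader model
    print(best)  # -> "The knight fought the dragon."

The paper performs a more sophisticated search over continuations; this greedy one-step scorer is only meant to convey the core idea of ranking candidates by how they change the inferred reader-model graph relative to the goal.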
Anthology ID:
2022.findings-emnlp.526
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
7087–7111
URL:
https://aclanthology.org/2022.findings-emnlp.526
DOI:
10.18653/v1/2022.findings-emnlp.526
Cite (ACL):
Xiangyu Peng, Kaige Xie, Amal Alabdulkarim, Harshith Kayam, Samihan Dani, and Mark Riedl. 2022. Guiding Neural Story Generation with Reader Models. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 7087–7111, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Guiding Neural Story Generation with Reader Models (Peng et al., Findings 2022)
PDF:
https://preview.aclanthology.org/nschneid-patch-5/2022.findings-emnlp.526.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-5/2022.findings-emnlp.526.mp4