Abstract
We explore story generation: creative systems that can build coherent and fluent passages of text about a topic. We collect a large dataset of 300K human-written stories paired with writing prompts from an online forum. Our dataset enables hierarchical story generation, where the model first generates a premise and then transforms it into a passage of text. We gain further improvements from a novel form of model fusion that improves the relevance of the story to the prompt, and from a new gated multi-scale self-attention mechanism that models long-range context. Experiments show large improvements over strong baselines on both automated and human evaluations. Human judges prefer stories generated by our approach to those from a strong non-hierarchical model by a factor of two to one.
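As a rough illustration of the gated multi-scale self-attention the abstract mentions, below is a minimal PyTorch-style sketch, not the authors' fairseq implementation: attention is computed over views of the context downsampled at several scales (to reach long-range dependencies cheaply), and a GLU-style gate controls the output. All module and parameter names and the choice of pooling scales are illustrative assumptions; causal masking is omitted for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedMultiScaleSelfAttention(nn.Module):
    """Sketch: attention over contexts downsampled at several scales,
    followed by a GLU-style output gate. Causal masking omitted for brevity."""

    def __init__(self, dim, scales=(1, 2, 4)):
        super().__init__()
        self.scales = scales
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)
        self.out = nn.Linear(dim * len(scales), dim)
        # GLU-style gate: one half of the projection gates the other half.
        self.gate = nn.Linear(dim, 2 * dim)

    def forward(self, x):
        # x: (batch, time, dim)
        B, T, D = x.shape
        q = self.q_proj(x)
        heads = []
        for s in self.scales:
            # Average-pool the sequence by a factor of s for a coarser, longer-range view.
            ctx = F.avg_pool1d(x.transpose(1, 2), kernel_size=s, stride=s,
                               ceil_mode=True).transpose(1, 2)
            k, v = self.k_proj(ctx), self.v_proj(ctx)
            attn = torch.softmax(q @ k.transpose(1, 2) / D ** 0.5, dim=-1)  # (B, T, ceil(T/s))
            heads.append(attn @ v)                                          # (B, T, D)
        h = self.out(torch.cat(heads, dim=-1))
        a, b = self.gate(h).chunk(2, dim=-1)
        return a * torch.sigmoid(b)  # gated output, shape (B, T, D)

# Tiny usage example with random inputs.
x = torch.randn(2, 16, 64)
layer = GatedMultiScaleSelfAttention(dim=64)
print(layer(x).shape)  # torch.Size([2, 16, 64])
```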
 - Anthology ID: P18-1082
 - Volume: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
 - Month: July
 - Year: 2018
 - Address: Melbourne, Australia
 - Editors: Iryna Gurevych, Yusuke Miyao
 - Venue: ACL
 - Publisher: Association for Computational Linguistics
 - Pages: 889–898
 - URL: https://aclanthology.org/P18-1082
 - DOI: 10.18653/v1/P18-1082
 - Cite (ACL): Angela Fan, Mike Lewis, and Yann Dauphin. 2018. Hierarchical Neural Story Generation. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 889–898, Melbourne, Australia. Association for Computational Linguistics.
 - Cite (Informal): Hierarchical Neural Story Generation (Fan et al., ACL 2018)
 - PDF: https://preview.aclanthology.org/ingest-acl-2023-videos/P18-1082.pdf
 - Code: pytorch/fairseq + additional community code
 - Data: WritingPrompts