@inproceedings{orbach-goldberg-2020-facts2story,
    title = "{F}acts2{S}tory: Controlling Text Generation by Key Facts",
    author = "Orbach, Eyal  and
      Goldberg, Yoav",
    editor = "Scott, Donia  and
      Bel, Nuria  and
      Zong, Chengqing",
    booktitle = "Proceedings of the 28th International Conference on Computational Linguistics",
    month = dec,
    year = "2020",
    address = "Barcelona, Spain (Online)",
    publisher = "International Committee on Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2020.coling-main.211/",
    doi = "10.18653/v1/2020.coling-main.211",
    pages = "2329--2345",
    abstract = "Recent advancements in self-attention neural network architectures have raised the bar for open-ended text generation. Yet, while current methods are capable of producing a coherent text which is several hundred words long, attaining control over the content that is being generated{---}as well as evaluating it{---}are still open questions. We propose a controlled generation task which is based on expanding a sequence of facts, expressed in natural language, into a longer narrative. We introduce human-based evaluation metrics for this task, as well as a method for deriving a large training dataset. We evaluate three methods on this task, based on fine-tuning pre-trained models. We show that while auto-regressive, unidirectional Language Models such as GPT2 produce better fluency, they struggle to adhere to the requested facts. We propose a plan-and-cloze model (using fine-tuned XLNet) which produces competitive fluency while adhering to the requested content."
}