Abstract
We follow the step-by-step approach to neural data-to-text generation proposed by Moryossef et al. (2019), in which the generation process is divided into a text planning stage followed by a plan realization stage. We suggest four extensions to that framework: (1) we introduce a trainable neural planning component that can generate effective plans several orders of magnitude faster than the original planner; (2) we incorporate typing hints that improve the model's ability to deal with unseen relations and entities; (3) we introduce a verification-by-reranking stage that substantially improves the faithfulness of the resulting texts; (4) we incorporate a simple but effective referring expression generation module. These extensions result in a generation process that is faster, more fluent, and more accurate.
- Anthology ID: W19-8645
- Volume: Proceedings of the 12th International Conference on Natural Language Generation
- Month: October–November
- Year: 2019
- Address: Tokyo, Japan
- Editors: Kees van Deemter, Chenghua Lin, Hiroya Takamura
- Venue: INLG
- SIG: SIGGEN
- Publisher: Association for Computational Linguistics
- Pages: 377–382
- URL: https://aclanthology.org/W19-8645
- DOI: 10.18653/v1/W19-8645
- Cite (ACL): Amit Moryossef, Yoav Goldberg, and Ido Dagan. 2019. Improving Quality and Efficiency in Plan-based Neural Data-to-text Generation. In Proceedings of the 12th International Conference on Natural Language Generation, pages 377–382, Tokyo, Japan. Association for Computational Linguistics.
- Cite (Informal): Improving Quality and Efficiency in Plan-based Neural Data-to-text Generation (Moryossef et al., INLG 2019)
- PDF: https://preview.aclanthology.org/nschneid-patch-2/W19-8645.pdf
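The verification-by-reranking stage described in the abstract can be illustrated with a minimal sketch: given several candidate realizations of a plan, prefer the candidate that covers the most required facts. This is a hypothetical illustration, not the authors' implementation; the scoring function, names, and example data below are all assumptions made for clarity.

```python
# Hypothetical sketch of verification-by-reranking: score each
# candidate realization by how many plan entities it mentions,
# then keep the most faithful one. Illustrative only.

def coverage_score(text, required_entities):
    """Fraction of required entities mentioned verbatim in the text."""
    mentioned = sum(1 for e in required_entities if e.lower() in text.lower())
    return mentioned / len(required_entities)

def rerank(candidates, required_entities):
    """Order candidate texts from most to least faithful."""
    return sorted(candidates,
                  key=lambda t: coverage_score(t, required_entities),
                  reverse=True)

# Toy plan requiring three facts, and two candidate realizations.
plan_entities = ["John", "London", "1986"]
candidates = [
    "John was born in 1986.",            # omits London
    "John was born in London in 1986.",  # covers all facts
]
best = rerank(candidates, plan_entities)[0]
```

In practice a system would score candidates produced by beam search with a stronger verifier (e.g. string or alignment-based fact matching), but the principle is the same: generation proposes, verification disposes.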