A Good Sample is Hard to Find: Noise Injection Sampling and Self-Training for Neural Language Generation Models

Chris Kedzie, Kathleen McKeown


Abstract
Deep neural networks (DNNs) are quickly becoming the de facto standard modeling method for many natural language generation (NLG) tasks. For such models to be truly useful, they must be able to generate correct utterances for novel meaning representations (MRs) at test time. In practice, even sophisticated DNNs with various forms of semantic control frequently fail to generate utterances faithful to the input MR. In this paper, we propose an architecture-agnostic self-training method that samples novel MR/text pairs to augment the original training data. Remarkably, after training on the augmented data, even simple encoder-decoder models with greedy decoding generate semantically correct utterances that match state-of-the-art outputs in both automatic and human evaluations of quality.
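The abstract only sketches the method, so the following is a minimal illustrative example rather than the authors' implementation. It assumes that "noise injection sampling" means perturbing the decoder hidden state with Gaussian noise during otherwise greedy decoding to obtain diverse candidate utterances, and that "self-training" means keeping only sampled MR/text pairs judged faithful by an external checker before adding them to the training data. All function names, APIs, and hyperparameters below are hypothetical.

import torch

@torch.no_grad()
def noise_injection_sample(decoder, mr_encoding, bos_id, eos_id,
                           max_len=50, noise_std=0.5):
    """Greedy decoding with Gaussian noise injected into the hidden state."""
    hidden = decoder.init_hidden(mr_encoding)               # hypothetical API
    token = torch.tensor([bos_id])
    output_ids = []
    for _ in range(max_len):
        logits, hidden = decoder.step(token, hidden)         # hypothetical API
        hidden = hidden + noise_std * torch.randn_like(hidden)  # inject noise
        token = logits.argmax(dim=-1)                        # greedy choice
        if token.item() == eos_id:
            break
        output_ids.append(token.item())
    return output_ids

def self_training_pairs(decoder, mrs, is_faithful, samples_per_mr=10, **kw):
    """Collect novel MR/text pairs that pass a faithfulness filter."""
    augmented = []
    for mr in mrs:
        for _ in range(samples_per_mr):
            text_ids = noise_injection_sample(decoder, mr, **kw)
            if is_faithful(mr, text_ids):   # e.g. a rule-based MR checker
                augmented.append((mr, text_ids))
    return augmented

Under these assumptions, the filtered pairs returned by self_training_pairs would simply be concatenated with the original training set and the model retrained from scratch; see the paper and the linked code repository for the actual procedure.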
Anthology ID:
W19-8672
Volume:
Proceedings of the 12th International Conference on Natural Language Generation
Month:
October–November
Year:
2019
Address:
Tokyo, Japan
Editors:
Kees van Deemter, Chenghua Lin, Hiroya Takamura
Venue:
INLG
SIG:
SIGGEN
Publisher:
Association for Computational Linguistics
Pages:
584–593
URL:
https://aclanthology.org/W19-8672
DOI:
10.18653/v1/W19-8672
Cite (ACL):
Chris Kedzie and Kathleen McKeown. 2019. A Good Sample is Hard to Find: Noise Injection Sampling and Self-Training for Neural Language Generation Models. In Proceedings of the 12th International Conference on Natural Language Generation, pages 584–593, Tokyo, Japan. Association for Computational Linguistics.
Cite (Informal):
A Good Sample is Hard to Find: Noise Injection Sampling and Self-Training for Neural Language Generation Models (Kedzie & McKeown, INLG 2019)
PDF:
https://preview.aclanthology.org/ingest-2024-clasp/W19-8672.pdf
Supplementary attachment:
W19-8672.Supplementary_Attachment.pdf
Code:
kedz/noiseylg