Self-Ensemble of N-best Generation Hypotheses by Lexically Constrained Decoding
Ryota Miyano | Tomoyuki Kajiwara | Yuki Arase
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing

We propose a method that ensembles N-best hypotheses to improve natural language generation. Previous studies have achieved notable improvements in generation quality by explicitly reranking N-best candidates, under the assumption that a hypothesis of higher quality exists among them. We extend this assumption to a more practical one: the N-best list contains hypotheses that are of higher quality in parts, even though each may be imperfect as a whole sentence. By merging these high-quality fragments, we can obtain an output of higher quality than the single best hypothesis. Specifically, we first obtain the N-best hypotheses and conduct token-level quality estimation. We then use the tokens that should or should not appear in the final output as lexical constraints during decoding. Empirical experiments on paraphrase generation, summarisation, and constrained text generation confirm that our method outperforms strong N-best reranking methods.
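A minimal sketch of the pipeline described in the abstract, not the authors' released code: the model name is an arbitrary choice for illustration, the trained token-level quality estimator is replaced by a crude cross-hypothesis agreement heuristic (tokens shared by most hypotheses are treated as "should appear", tokens seen only once as "should not appear"), and lexically constrained decoding is approximated with Hugging Face transformers' `force_words_ids` / `bad_words_ids` arguments rather than the paper's own decoder.

```python
from collections import Counter
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

MODEL_NAME = "facebook/bart-large-cnn"  # assumed summarisation model, for illustration only
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)


def self_ensemble(source: str, n_best: int = 5) -> str:
    inputs = tokenizer(source, return_tensors="pt", truncation=True)

    # Step 1: obtain the N-best hypotheses with beam search.
    beams = model.generate(**inputs, num_beams=n_best,
                           num_return_sequences=n_best, max_new_tokens=128)
    hypotheses = tokenizer.batch_decode(beams, skip_special_tokens=True)

    # Step 2: token-level quality estimation. The paper uses a trained QE model;
    # this stand-in scores a word by how many hypotheses it appears in.
    counts = Counter(w for hyp in hypotheses for w in set(hyp.split()))
    keep = [w for w, c in counts.items() if c >= n_best - 1 and w.strip()][:10]
    drop = [w for w, c in counts.items() if c == 1 and w.strip()][:10]

    # Step 3: decode again, applying the estimated tokens as positive and
    # negative lexical constraints.
    force_ids = [tokenizer(w, add_special_tokens=False).input_ids for w in keep]
    bad_ids = [tokenizer(w, add_special_tokens=False).input_ids for w in drop]
    constrained = model.generate(**inputs, num_beams=n_best, max_new_tokens=128,
                                 force_words_ids=force_ids or None,
                                 bad_words_ids=bad_ids or None)
    return tokenizer.batch_decode(constrained, skip_special_tokens=True)[0]
```

The constraint lists are capped here only to keep constrained beam search cheap; in the paper the kept and forbidden tokens come from the token-level quality estimator rather than from agreement counts.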