Frustratingly Easy Model Ensemble for Abstractive Summarization

Hayato Kobayashi


Abstract
Ensemble methods, which combine multiple models at decoding time, are now widely known to be effective for text-generation tasks. However, they generally increase computational costs, and thus, there have been many studies on compressing or distilling ensemble models. In this paper, we propose an alternative, simple but effective unsupervised ensemble method, post-ensemble, that combines multiple models by selecting a majority-like output in post-processing. We theoretically prove that our method is closely related to kernel density estimation based on the von Mises-Fisher kernel. Experimental results on a news-headline-generation task show that the proposed method performs better than the current ensemble methods.
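The post-ensemble idea lends itself to a short sketch: embed each model's decoded output, score every candidate by a kernel density estimate under the von Mises-Fisher kernel over those embeddings (which reduces to summing exponentiated cosine similarities), and return the densest, i.e., most majority-like, candidate. The toy `embed` function and the concentration parameter `kappa` below are illustrative assumptions, not the paper's exact choices.

```python
import numpy as np

def embed(text, dim=64):
    # Toy sentence embedding for illustration only: hashed bag-of-words,
    # L2-normalized so that dot products are cosine similarities.
    v = np.zeros(dim)
    for tok in text.lower().split():
        v[hash(tok) % dim] += 1.0
    n = np.linalg.norm(v)
    return v / n if n > 0 else v

def post_ensemble(candidates, kappa=10.0):
    """Select the majority-like candidate: the one maximizing a kernel
    density estimate with a von Mises-Fisher kernel, exp(kappa * cos(x, x')),
    evaluated over the embeddings of all candidates."""
    embs = np.stack([embed(c) for c in candidates])
    sims = embs @ embs.T                         # pairwise cosine similarities
    density = np.exp(kappa * sims).sum(axis=1)   # vMF-kernel density at each candidate
    return candidates[int(np.argmax(density))]

# Each string stands in for one model's decoded summary.
outputs = [
    "stocks fall on weak earnings",
    "stocks drop after weak earnings report",
    "stocks decline on earnings miss",
    "central bank holds rates steady",  # outlier output
]
print(post_ensemble(outputs))
```

With a reasonable embedding, the outlier output receives low density and one of the three near-duplicate headlines is selected, which is the intended majority-like behavior.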
Anthology ID:
D18-1449
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
4165–4176
URL:
https://aclanthology.org/D18-1449
DOI:
10.18653/v1/D18-1449
Cite (ACL):
Hayato Kobayashi. 2018. Frustratingly Easy Model Ensemble for Abstractive Summarization. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 4165–4176, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Frustratingly Easy Model Ensemble for Abstractive Summarization (Kobayashi, EMNLP 2018)
PDF:
https://aclanthology.org/D18-1449.pdf
Attachment:
D18-1449.Attachment.pdf
Poster:
D18-1449.Poster.pdf