Enhancing Sentence Embedding with Generalized Pooling

Qian Chen, Zhen-Hua Ling, Xiaodan Zhu

Abstract
Pooling is an essential component of a wide variety of sentence representation and embedding models. This paper explores generalized pooling methods to enhance sentence embedding. We propose vector-based multi-head attention that includes the widely used max pooling, mean pooling, and scalar self-attention as special cases. The model benefits from properly designed penalization terms to reduce redundancy in multi-head attention. We evaluate the proposed model on three different tasks: natural language inference (NLI), author profiling, and sentiment classification. The experiments show that the proposed model achieves significant improvement over strong sentence-encoding-based methods, resulting in state-of-the-art performance on four datasets. The proposed approach can be easily applied to problems beyond those discussed in this paper.
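To make the abstract's central idea concrete, below is a minimal NumPy sketch of vector-based multi-head attention pooling: each head scores every position with a vector of weights (one weight per hidden dimension) via a two-layer ReLU scorer, normalizes the scores over positions with a softmax, and takes an element-wise weighted sum of the hidden states. All dimensions, names, and the toy setup are illustrative assumptions, not the authors' exact configuration; see the linked repository for the official implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def vector_multihead_pool(H, W1_list, W2_list):
    """Pool a sentence matrix H (n positions x d dims) into one vector.

    Per head: a two-layer ReLU scorer produces a (d, n) score matrix,
    i.e. one weight per dimension AND per position (vector-based
    attention, illustrative sketch); scores are softmax-normalized
    over positions, and the head output is the element-wise weighted
    sum of hidden states. Head outputs are concatenated.
    """
    heads = []
    for W1, W2 in zip(W1_list, W2_list):
        scores = W2 @ np.maximum(W1 @ H.T, 0.0)  # (d, n) per-dim scores
        A = softmax(scores, axis=1)              # normalize over positions
        heads.append((A * H.T).sum(axis=1))      # (d,) weighted sum
    return np.concatenate(heads)                 # (num_heads * d,)

# Toy usage (hypothetical sizes): 5 positions, 8-dim states, 2 heads.
rng = np.random.default_rng(0)
n, d, d_a, num_heads = 5, 8, 16, 2
H = rng.standard_normal((n, d))
W1_list = [rng.standard_normal((d_a, d)) for _ in range(num_heads)]
W2_list = [rng.standard_normal((d, d_a)) for _ in range(num_heads)]
v = vector_multihead_pool(H, W1_list, W2_list)
print(v.shape)  # (16,)
```

The special cases claimed in the abstract fall out of this formulation: a uniform attention matrix A recovers mean pooling; as the softmax saturates toward each dimension's argmax position it approaches max pooling; and tying all d rows of A to a single shared row recovers scalar self-attention. The paper's penalization terms (not shown here) discourage different heads from producing redundant attention.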
Anthology ID:
C18-1154
Volume:
Proceedings of the 27th International Conference on Computational Linguistics
Month:
August
Year:
2018
Address:
Santa Fe, New Mexico, USA
Editors:
Emily M. Bender, Leon Derczynski, Pierre Isabelle
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
1815–1826
URL:
https://aclanthology.org/C18-1154
Cite (ACL):
Qian Chen, Zhen-Hua Ling, and Xiaodan Zhu. 2018. Enhancing Sentence Embedding with Generalized Pooling. In Proceedings of the 27th International Conference on Computational Linguistics, pages 1815–1826, Santa Fe, New Mexico, USA. Association for Computational Linguistics.
Cite (Informal):
Enhancing Sentence Embedding with Generalized Pooling (Chen et al., COLING 2018)
PDF:
https://aclanthology.org/C18-1154.pdf
Code
 lukecq1231/generalized-pooling
Data
MultiNLI, SNLI, Yelp