Convolutional Neural Network for Universal Sentence Embeddings

Xiaoqi Jiao, Fang Wang, Dan Feng


Abstract
This paper proposes a simple CNN model for creating general-purpose sentence embeddings that transfer easily across domains and can also serve as effective initialization for downstream tasks. Recently, averaging the embeddings of the words in a sentence has proven to be a surprisingly successful and efficient way of obtaining sentence embeddings. However, these models represent a sentence only in terms of the features of its words, or uni-grams. In contrast, our model (CSE) utilizes features of both words and n-grams to encode sentences, and is in fact a generalization of these bag-of-words models. Extensive experiments demonstrate that CSE outperforms averaging models in the transfer learning setting, and exceeds the state of the art in the supervised learning setting when model parameters are initialized with the pre-trained sentence embeddings.
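The idea of combining a bag-of-words average with pooled n-gram features can be sketched as follows. This is an illustrative toy encoder, not the paper's trained model: the filter weights are random placeholders, the bigram window size and filter count are arbitrary choices, and the final embedding is simply the concatenation of the two feature types.

```python
import random

def embed_sentence(word_vecs, ngram=2, num_filters=4, seed=0):
    """Toy CSE-style encoder: concatenates the unigram average with
    max-pooled n-gram convolution features. Filter weights here are
    random placeholders, not parameters learned as in the paper."""
    dim = len(word_vecs[0])
    rng = random.Random(seed)
    # One linear filter per feature map over an n-gram window (n * dim inputs).
    filters = [[rng.uniform(-1, 1) for _ in range(ngram * dim)]
               for _ in range(num_filters)]

    # Bag-of-words component: average of the word embeddings.
    avg = [sum(v[i] for v in word_vecs) / len(word_vecs) for i in range(dim)]

    # n-gram component: slide each filter over the sentence, max-pool over positions.
    pooled = []
    for w in filters:
        scores = []
        for start in range(len(word_vecs) - ngram + 1):
            window = [x for vec in word_vecs[start:start + ngram] for x in vec]
            scores.append(sum(a * b for a, b in zip(w, window)))
        pooled.append(max(scores))

    # Sentence embedding of size dim + num_filters.
    return avg + pooled

# Toy 3-word sentence with 2-dimensional word vectors.
sent = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
emb = embed_sentence(sent)
print(len(emb))  # 2 + 4 = 6
```

With n-gram windows of size 1 and identity-like pooling, the n-gram component reduces to per-word features, which is why the abstract describes CSE as a generalization of bag-of-words averaging.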
Anthology ID:
C18-1209
Volume:
Proceedings of the 27th International Conference on Computational Linguistics
Month:
August
Year:
2018
Address:
Santa Fe, New Mexico, USA
Editors:
Emily M. Bender, Leon Derczynski, Pierre Isabelle
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
2470–2481
URL:
https://aclanthology.org/C18-1209
Cite (ACL):
Xiaoqi Jiao, Fang Wang, and Dan Feng. 2018. Convolutional Neural Network for Universal Sentence Embeddings. In Proceedings of the 27th International Conference on Computational Linguistics, pages 2470–2481, Santa Fe, New Mexico, USA. Association for Computational Linguistics.
Cite (Informal):
Convolutional Neural Network for Universal Sentence Embeddings (Jiao et al., COLING 2018)
PDF:
https://preview.aclanthology.org/naacl24-info/C18-1209.pdf