Abstract
While one of the first steps in many NLP systems is selecting what pre-trained word embeddings to use, we argue that such a step is better left for neural networks to figure out by themselves. To that end, we introduce dynamic meta-embeddings, a simple yet effective method for the supervised learning of embedding ensembles, which leads to state-of-the-art performance within the same model class on a variety of tasks. We subsequently show how the technique can be used to shed new light on the usage of word embeddings in NLP systems.
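The core idea is to project several pre-trained embedding sets into a common space and let the network learn per-token attention weights over them, jointly with the downstream task. Below is a minimal PyTorch sketch of that mechanism; the class name `DynamicMetaEmbedding`, the `proj_dim` default, and the shared-vocabulary assumption are illustrative choices, not taken from the authors' released code (see the facebookresearch/DME repository linked below for the reference implementation).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicMetaEmbedding(nn.Module):
    """Sketch of dynamic meta-embeddings: each token's representation is an
    attention-weighted sum of several pre-trained embeddings, with the
    attention weights learned jointly with the downstream task."""

    def __init__(self, embeddings, proj_dim=256):
        super().__init__()
        # `embeddings` is a list of (typically frozen) nn.Embedding modules,
        # e.g. one for GloVe and one for fastText, possibly of different sizes.
        self.embeddings = nn.ModuleList(embeddings)
        # Project each embedding type into a common proj_dim-dimensional space.
        self.projections = nn.ModuleList(
            nn.Linear(e.embedding_dim, proj_dim) for e in embeddings
        )
        # Scalar attention score per projected embedding.
        self.attention = nn.Linear(proj_dim, 1)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len); for simplicity this sketch assumes a
        # shared vocabulary/index space across all embedding types.
        projected = torch.stack(
            [proj(emb(token_ids))
             for emb, proj in zip(self.embeddings, self.projections)],
            dim=2,
        )  # (batch, seq_len, n_embedding_types, proj_dim)
        # Normalize the scores over the embedding-type axis.
        weights = F.softmax(self.attention(projected), dim=2)
        # Weighted sum over embedding types -> (batch, seq_len, proj_dim).
        return (weights * projected).sum(dim=2)
```

The resulting meta-embedding can then be fed into any standard sentence encoder; because the attention weights are per token, the model can rely on different embedding sets for different words.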
- Anthology ID:
- D18-1176
- Volume:
- Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
- Month:
- October-November
- Year:
- 2018
- Address:
- Brussels, Belgium
- Editors:
- Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
- Venue:
- EMNLP
- SIG:
- SIGDAT
- Publisher:
- Association for Computational Linguistics
- Pages:
- 1466–1477
- URL:
- https://aclanthology.org/D18-1176
- DOI:
- 10.18653/v1/D18-1176
- Cite (ACL):
- Douwe Kiela, Changhan Wang, and Kyunghyun Cho. 2018. Dynamic Meta-Embeddings for Improved Sentence Representations. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 1466–1477, Brussels, Belgium. Association for Computational Linguistics.
- Cite (Informal):
- Dynamic Meta-Embeddings for Improved Sentence Representations (Kiela et al., EMNLP 2018)
- PDF:
- https://aclanthology.org/D18-1176.pdf
- Code
- facebookresearch/DME + additional community code
- Data
- MultiNLI, SNLI, SST