Attentive Mimicking: Better Word Embeddings by Attending to Informative Contexts

Timo Schick, Hinrich Schütze


Abstract
Learning high-quality embeddings for rare words is a hard problem because of sparse context information. Mimicking (Pinter et al., 2017) has been proposed as a solution: given embeddings learned by a standard algorithm, a model is first trained to reproduce embeddings of frequent words from their surface form and then used to compute embeddings for rare words. In this paper, we introduce attentive mimicking: the mimicking model is given access not only to a word’s surface form, but also to all available contexts and learns to attend to the most informative and reliable contexts for computing an embedding. In an evaluation on four tasks, we show that attentive mimicking outperforms previous work for both rare and medium-frequency words. Thus, compared to previous work, attentive mimicking improves embeddings for a much larger part of the vocabulary, including the medium-frequency range.
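The abstract describes an architecture that combines a word's surface form with an attention-weighted combination of its contexts. The following is a minimal sketch of that idea in NumPy, not the authors' implementation (see the timoschick/form-context-model repository linked below for that). The specifics here are assumptions: fastText-style character n-grams for the surface form, one vector per context (the mean of its word embeddings), a bilinear similarity matrix `W` for scoring contexts, and a fixed gate `alpha`; in the paper, the parameters are trained so that the output reproduces the original embeddings of frequent words.

```python
import numpy as np


def char_ngrams(word, n_min=3, n_max=5):
    """Character n-grams of the padded surface form (assumed representation)."""
    padded = f"<{word}>"
    return [padded[i:i + n]
            for n in range(n_min, n_max + 1)
            for i in range(len(padded) - n + 1)]


class AttentiveMimic:
    """Combines a form embedding with an attention-weighted context embedding."""

    def __init__(self, dim, seed=0):
        self.dim = dim
        self.rng = np.random.default_rng(seed)
        self.ngram_vecs = {}  # n-gram embeddings (trained in the paper, random here)
        # Hypothetical bilinear similarity parameters for scoring contexts.
        self.W = self.rng.normal(scale=0.1, size=(dim, dim))
        self.alpha = 0.5  # form/context gate (learned in the paper, fixed here)

    def form_embedding(self, word):
        # Average the n-gram vectors of the word's surface form.
        vecs = [self.ngram_vecs.setdefault(
                    g, self.rng.normal(scale=0.1, size=self.dim))
                for g in char_ngrams(word)]
        return np.mean(vecs, axis=0)

    def context_embedding(self, contexts, emb):
        # One vector per context: mean of the embeddings of its known words.
        C = np.stack([np.mean([emb[w] for w in ctx if w in emb], axis=0)
                      for ctx in contexts])
        # Score each context by its total similarity to all contexts, then
        # softmax the scores into attention weights, so that informative,
        # mutually consistent contexts receive more weight.
        scores = (C @ self.W @ C.T).sum(axis=1)
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()
        return weights @ C

    def embed(self, word, contexts, emb):
        # Gated combination of surface-form and attended context information.
        return (self.alpha * self.form_embedding(word)
                + (1 - self.alpha) * self.context_embedding(contexts, emb))
```

A toy usage example, embedding an unseen word from two contexts drawn from a small pretrained-embedding table:

```python
rng = np.random.default_rng(1)
emb = {w: rng.normal(size=50) for w in "the cat sat on mat".split()}
model = AttentiveMimic(dim=50)
vec = model.embed("flurp", [["the", "cat", "sat"], ["on", "the", "mat"]], emb)
print(vec.shape)  # (50,)
```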
Anthology ID:
N19-1048
Volume:
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)
Month:
June
Year:
2019
Address:
Minneapolis, Minnesota
Editors:
Jill Burstein, Christy Doran, Thamar Solorio
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
489–494
URL:
https://aclanthology.org/N19-1048
DOI:
10.18653/v1/N19-1048
Cite (ACL):
Timo Schick and Hinrich Schütze. 2019. Attentive Mimicking: Better Word Embeddings by Attending to Informative Contexts. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pages 489–494, Minneapolis, Minnesota. Association for Computational Linguistics.
Cite (Informal):
Attentive Mimicking: Better Word Embeddings by Attending to Informative Contexts (Schick & Schütze, NAACL 2019)
PDF:
https://aclanthology.org/N19-1048.pdf
Supplementary:
N19-1048.Supplementary.pdf
Code:
timoschick/form-context-model