Mao Isogawa


2020

Tiny Word Embeddings Using Globally Informed Reconstruction
Sora Ohashi | Mao Isogawa | Tomoyuki Kajiwara | Yuki Arase
Proceedings of the 28th International Conference on Computational Linguistics

We reduce the model size of pre-trained word embeddings by a factor of 200 while preserving their quality. Previous studies in this direction created smaller word embedding models by reconstructing pre-trained word representations from those of subwords, which allows storing only a small number of subword embeddings in memory. However, previous studies that train the reconstruction models using only the target words cannot reduce the model size drastically while preserving quality. Inspired by the observation that words with similar meanings have similar embeddings, our reconstruction training learns the global relationships among words, and it can be employed in various models for word embedding reconstruction. Experimental results on word similarity benchmarks show that the proposed method improves the performance of all subword-based reconstruction models.
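The abstract describes two ingredients: composing a word vector from a small table of subword embeddings, and a training objective that also preserves global relationships (words with similar meanings should stay close after reconstruction). The sketch below is a minimal conceptual illustration of that idea, not the authors' implementation; the module names, pooling scheme, similarity-matrix loss, and hyperparameters are all assumptions chosen for clarity.

```python
# Conceptual sketch only (assumed design, not the paper's code): reconstruct
# word embeddings from a small subword table, and train with a "local" term
# (match each target vector) plus a "global" term (match pairwise similarity
# structure within a batch).
import torch
import torch.nn as nn
import torch.nn.functional as F


class SubwordReconstructor(nn.Module):
    def __init__(self, num_subwords: int, dim: int):
        super().__init__()
        # Only this (much smaller) subword table is kept at inference time,
        # instead of the full word-embedding matrix.
        self.subword_emb = nn.Embedding(num_subwords, dim)
        self.proj = nn.Linear(dim, dim)

    def forward(self, subword_ids: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        # subword_ids: (batch, max_subwords); mask: same shape, 1.0 for real subwords.
        vecs = self.subword_emb(subword_ids) * mask.unsqueeze(-1)
        pooled = vecs.sum(dim=1) / mask.sum(dim=1, keepdim=True).clamp(min=1.0)
        return self.proj(pooled)


def reconstruction_loss(pred: torch.Tensor, target: torch.Tensor,
                        global_weight: float = 1.0) -> torch.Tensor:
    # Local term: reconstruct each pre-trained target embedding directly.
    local = F.mse_loss(pred, target)
    # Global term (illustrative): preserve the pairwise cosine-similarity
    # structure among the words in the batch, so similar words remain similar.
    pred_n = F.normalize(pred, dim=-1)
    tgt_n = F.normalize(target, dim=-1)
    global_term = F.mse_loss(pred_n @ pred_n.T, tgt_n @ tgt_n.T)
    return local + global_weight * global_term
```

In this sketch the global term operates on batches of words rather than single targets, which is one plausible way to encode "global relationships among words"; the paper's actual formulation may differ.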