A Discriminative Latent-Variable Model for Bilingual Lexicon Induction

Sebastian Ruder, Ryan Cotterell, Yova Kementchedjhieva, Anders Søgaard


Abstract
We introduce a novel discriminative latent-variable model for the task of bilingual lexicon induction. Our model combines the bipartite matching dictionary prior of Haghighi et al. (2008) with a state-of-the-art embedding-based approach. To train the model, we derive an efficient Viterbi EM algorithm. We provide empirical improvements on six language pairs under two metrics and show that the prior theoretically and empirically helps to mitigate the hubness problem. We also demonstrate how previous work may be viewed as a similarly fashioned latent-variable model, albeit with a different prior.
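The key ingredient named in the abstract is a bipartite-matching dictionary prior: under it, each source word may translate to at most one target word, rather than each source word independently taking its nearest neighbour. The sketch below illustrates that step on a toy similarity matrix; the data, names, and the exhaustive search are illustrative only (the paper combines matching with embedding-based scores inside a Viterbi EM loop, and real vocabularies would use the Hungarian algorithm or similar).

```python
# Illustrative sketch of a bipartite-matching step, as imposed by a
# matching prior: every source word pairs with at most one target word.
# Toy numbers; not the paper's actual model or data.
import itertools

# toy source-to-target cosine similarities (rows: source, cols: target)
sim = [
    [0.9, 0.1, 0.3],
    [0.8, 0.7, 0.2],
    [0.4, 0.6, 0.95],
]

def best_matching(sim):
    """Exhaustive maximum-weight bipartite matching.

    Fine for toy sizes; the Hungarian algorithm gives the same
    result in polynomial time for realistic vocabularies.
    """
    n = len(sim)
    best, best_score = None, float("-inf")
    for perm in itertools.permutations(range(n)):
        score = sum(sim[i][j] for i, j in enumerate(perm))
        if score > best_score:
            best, best_score = perm, score
    return best

match = best_matching(sim)
# Independent nearest-neighbour lookup would send BOTH source words 0
# and 1 to target 0 (a "hub"); the matching constraint forces source 1
# to target 1 instead, which is how a matching prior mitigates hubness.
print(match)  # -> (0, 1, 2)
```

Note how target 0 is the nearest neighbour of two different source words in this toy matrix; the one-to-one constraint resolves the conflict, which is the intuition behind the paper's claim that the prior mitigates the hubness problem.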
Anthology ID:
D18-1042
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
458–468
URL:
https://aclanthology.org/D18-1042
DOI:
10.18653/v1/D18-1042
Cite (ACL):
Sebastian Ruder, Ryan Cotterell, Yova Kementchedjhieva, and Anders Søgaard. 2018. A Discriminative Latent-Variable Model for Bilingual Lexicon Induction. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 458–468, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
A Discriminative Latent-Variable Model for Bilingual Lexicon Induction (Ruder et al., EMNLP 2018)
PDF:
https://preview.aclanthology.org/nschneid-patch-3/D18-1042.pdf
Code:
sebastianruder/latent-variable-vecmap