Bilingual Lexicon Induction by Learning to Combine Word-Level and Character-Level Representations

Geert Heyman, Ivan Vulić, Marie-Francine Moens


Abstract
We study the problem of bilingual lexicon induction (BLI) in a setting where some translation resources are available, but unknown translations are sought for certain, possibly domain-specific, terminology. We frame BLI as a classification problem, for which we design a neural-network-based classification architecture composed of recurrent long short-term memory (LSTM) and deep feed-forward networks. The results show that word- and character-level representations each improve state-of-the-art results for BLI, and that the best results are obtained by exploiting the synergy between these word- and character-level representations in the classification model.
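The architecture described in the abstract can be sketched roughly as follows. This is a minimal NumPy illustration under assumed dimensions and randomly initialized parameters, not the authors' implementation: a character-level LSTM encodes each word, its final hidden state is concatenated with the word-level embedding, and a feed-forward network scores a (source, target) candidate pair as a translation or not.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_char_encoder(chars, params):
    """Run a single-layer LSTM over a sequence of character embeddings
    and return the final hidden state as the character-level word encoding."""
    W, U, b = params                            # input, recurrent, bias params for all 4 gates
    h_dim = U.shape[1]
    h = np.zeros(h_dim)
    c = np.zeros(h_dim)
    for x in chars:
        z = W @ x + U @ h + b                   # stacked pre-activations, shape (4*h_dim,)
        i = sigmoid(z[0:h_dim])                 # input gate
        f = sigmoid(z[h_dim:2 * h_dim])         # forget gate
        o = sigmoid(z[2 * h_dim:3 * h_dim])     # output gate
        g = np.tanh(z[3 * h_dim:4 * h_dim])     # candidate cell state
        c = f * c + i * g
        h = o * np.tanh(c)
    return h

# Hypothetical dimensions: 16-dim char embeddings, 32-dim LSTM state, 50-dim word embeddings.
c_dim, h_dim, w_dim = 16, 32, 50
params = (rng.normal(0, 0.1, (4 * h_dim, c_dim)),
          rng.normal(0, 0.1, (4 * h_dim, h_dim)),
          np.zeros(4 * h_dim))

def classify_pair(src_word_vec, tgt_word_vec, src_chars, tgt_chars, ff):
    """Score a (source, target) candidate pair: concatenate the word-level and
    character-level representations of both words, then apply a feed-forward
    network with a sigmoid output (translation probability)."""
    src_char = lstm_char_encoder(src_chars, params)
    tgt_char = lstm_char_encoder(tgt_chars, params)
    x = np.concatenate([src_word_vec, src_char, tgt_word_vec, tgt_char])
    W1, b1, W2, b2 = ff
    hidden = np.tanh(W1 @ x + b1)               # one hidden layer for brevity
    return sigmoid(W2 @ hidden + b2)[0]

in_dim = 2 * (w_dim + h_dim)
ff = (rng.normal(0, 0.1, (64, in_dim)), np.zeros(64),
      rng.normal(0, 0.1, (1, 64)), np.zeros(1))

# Toy example: random embeddings for a 4-character source word and 5-character target word.
score = classify_pair(rng.normal(size=w_dim), rng.normal(size=w_dim),
                      rng.normal(size=(4, c_dim)), rng.normal(size=(5, c_dim)), ff)
```

In the paper's setting the classifier is trained on positive and negative translation pairs from a seed lexicon; the sketch above only shows the forward pass that combines the two representation levels.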
Anthology ID:
E17-1102
Volume:
Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 1, Long Papers
Month:
April
Year:
2017
Address:
Valencia, Spain
Editors:
Mirella Lapata, Phil Blunsom, Alexander Koller
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
1085–1095
URL:
https://aclanthology.org/E17-1102
Cite (ACL):
Geert Heyman, Ivan Vulić, and Marie-Francine Moens. 2017. Bilingual Lexicon Induction by Learning to Combine Word-Level and Character-Level Representations. In Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 1, Long Papers, pages 1085–1095, Valencia, Spain. Association for Computational Linguistics.
Cite (Informal):
Bilingual Lexicon Induction by Learning to Combine Word-Level and Character-Level Representations (Heyman et al., EACL 2017)
PDF:
https://preview.aclanthology.org/emnlp22-frontmatter/E17-1102.pdf