Tailoring Neural Architectures for Translating from Morphologically Rich Languages

Peyman Passban, Andy Way, Qun Liu


Abstract
A morphologically complex word (MCW) is a hierarchical constituent with meaning-preserving subunits, so word-based models which rely on surface forms might not be powerful enough to translate such structures. When translating from morphologically rich languages (MRLs), a source word could be mapped to several words or even a full sentence on the target side, which means an MCW should not be treated as an atomic unit. In order to provide better translations for MRLs, we boost the existing neural machine translation (NMT) architecture with a double-channel encoder and a double-attentive decoder. The main goal of this research is to provide richer information on the encoder side and to redesign the decoder accordingly so that it can benefit from such information. Our experimental results demonstrate that we achieve this goal, as the proposed model outperforms existing subword- and character-based architectures and shows significant improvements when translating from German, Russian, and Turkish into English.
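
The abstract only names the two components. As a rough illustration of the general idea, rather than the authors' exact model, the sketch below shows a double-channel encoder (here assumed to be a word-level and a character-level bidirectional GRU) whose two sets of states are read by two separate attention modules; the decoder concatenates both contexts at every step. All class names, dimensions, and the choices of GRUs, additive attention, and teacher forcing are illustrative assumptions, not details taken from the paper.

import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Additive (Bahdanau-style) attention over one encoder channel."""
    def __init__(self, enc_dim, dec_dim, attn_dim):
        super().__init__()
        self.w_enc = nn.Linear(enc_dim, attn_dim)
        self.w_dec = nn.Linear(dec_dim, attn_dim)
        self.v = nn.Linear(attn_dim, 1)

    def forward(self, dec_state, enc_states):
        # dec_state: (batch, dec_dim); enc_states: (batch, src_len, enc_dim)
        scores = self.v(torch.tanh(self.w_enc(enc_states)
                                   + self.w_dec(dec_state).unsqueeze(1)))
        weights = torch.softmax(scores, dim=1)        # attention over src_len
        return (weights * enc_states).sum(dim=1)      # context: (batch, enc_dim)


class DoubleChannelNMT(nn.Module):
    """Hypothetical sketch of a double-channel encoder with a
    double-attentive decoder: one channel over words, one over characters."""
    def __init__(self, word_vocab, char_vocab, tgt_vocab, emb=256, hid=512):
        super().__init__()
        self.word_emb = nn.Embedding(word_vocab, emb)
        self.char_emb = nn.Embedding(char_vocab, emb)
        self.word_enc = nn.GRU(emb, hid, batch_first=True, bidirectional=True)
        self.char_enc = nn.GRU(emb, hid, batch_first=True, bidirectional=True)
        self.attn_word = ChannelAttention(2 * hid, hid, hid)
        self.attn_char = ChannelAttention(2 * hid, hid, hid)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb)
        # decoder input = target embedding + word context + character context
        self.dec_cell = nn.GRUCell(emb + 4 * hid, hid)
        self.out = nn.Linear(hid, tgt_vocab)

    def forward(self, src_words, src_chars, tgt):
        word_states, _ = self.word_enc(self.word_emb(src_words))
        char_states, _ = self.char_enc(self.char_emb(src_chars))
        state = word_states.new_zeros(src_words.size(0), self.dec_cell.hidden_size)
        logits = []
        for t in range(tgt.size(1)):                  # teacher forcing over target
            c_word = self.attn_word(state, word_states)
            c_char = self.attn_char(state, char_states)
            x = torch.cat([self.tgt_emb(tgt[:, t]), c_word, c_char], dim=-1)
            state = self.dec_cell(x, state)
            logits.append(self.out(state))
        return torch.stack(logits, dim=1)             # (batch, tgt_len, tgt_vocab)

The key design point the abstract argues for is visible here: the decoder does not collapse the two source views into one representation up front; instead it queries each channel with its own attention at every time step, so morpheme/character-level evidence and word-level evidence can be weighted differently per target token.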
Anthology ID:
C18-1265
Volume:
Proceedings of the 27th International Conference on Computational Linguistics
Month:
August
Year:
2018
Address:
Santa Fe, New Mexico, USA
Editors:
Emily M. Bender, Leon Derczynski, Pierre Isabelle
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
3134–3145
URL:
https://aclanthology.org/C18-1265
Cite (ACL):
Peyman Passban, Andy Way, and Qun Liu. 2018. Tailoring Neural Architectures for Translating from Morphologically Rich Languages. In Proceedings of the 27th International Conference on Computational Linguistics, pages 3134–3145, Santa Fe, New Mexico, USA. Association for Computational Linguistics.
Cite (Informal):
Tailoring Neural Architectures for Translating from Morphologically Rich Languages (Passban et al., COLING 2018)
PDF:
https://preview.aclanthology.org/fix-dup-bibkey/C18-1265.pdf