Efficient Bilingual Generalization from Neural Transduction Grammar Induction

Yuchen Yan, Dekai Wu, Serkan Kumyol


Abstract
We introduce (1) a novel neural network structure for bilingual modeling of sentence pairs that allows efficient capturing of bilingual relationships via biconstituent composition, (2) the concept of neural network biparsing, which applies not only to machine translation (MT) but also to a variety of other bilingual research areas, and (3) the concept of a biparsing-backpropagation training loop, which we hypothesize can efficiently learn complex biparse tree patterns. Our work is distinguished from the sequential attention-based models more traditionally found in neural machine translation (NMT) in three aspects. First, our model enforces compositional constraints. Second, our model has a smaller search space in terms of discovering bilingual relationships from bilingual sentence pairs. Third, our model produces explicit biparse trees, which enable transparent error analysis during evaluation and external tree constraints during training.
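The biconstituent composition described above can be illustrated with a minimal sketch (all names, dimensions, and weights here are hypothetical illustrations, not taken from the paper): each bilingual constituent pair carries an embedding, and two child biconstituents compose into a parent embedding under either a straight or an inverted orientation, as in inversion transduction grammars, yielding an explicit biparse tree.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8  # embedding dimension (hypothetical)

# Hypothetical composition weights, one matrix per ITG orientation
W_straight = rng.standard_normal((D, 2 * D)) * 0.1
W_inverted = rng.standard_normal((D, 2 * D)) * 0.1

def compose(left, right, inverted=False):
    """Compose two child biconstituent embeddings into a parent embedding."""
    W = W_inverted if inverted else W_straight
    return np.tanh(W @ np.concatenate([left, right]))

# Leaves: embeddings of aligned word pairs (the smallest biconstituents)
leaf_a = rng.standard_normal(D)
leaf_b = rng.standard_normal(D)
leaf_c = rng.standard_normal(D)

# A tiny biparse tree: compose [a b] straight, then [[a b] c] inverted
node_ab = compose(leaf_a, leaf_b)
root = compose(node_ab, leaf_c, inverted=True)
```

In a biparsing-backpropagation loop, the tree structure would be proposed by a biparser and the composition weights updated by backpropagating a loss through the tree; the sketch only shows the forward composition step.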
Anthology ID:
2019.iwslt-1.28
Volume:
Proceedings of the 16th International Conference on Spoken Language Translation
Month:
November 2-3
Year:
2019
Address:
Hong Kong
Editors:
Jan Niehues, Rolando Cattoni, Sebastian Stüker, Matteo Negri, Marco Turchi, Thanh-Le Ha, Elizabeth Salesky, Ramon Sanabria, Loic Barrault, Lucia Specia, Marcello Federico
Venue:
IWSLT
SIG:
SIGSLT
Publisher:
Association for Computational Linguistics
URL:
https://aclanthology.org/2019.iwslt-1.28
Cite (ACL):
Yuchen Yan, Dekai Wu, and Serkan Kumyol. 2019. Efficient Bilingual Generalization from Neural Transduction Grammar Induction. In Proceedings of the 16th International Conference on Spoken Language Translation, Hong Kong. Association for Computational Linguistics.
Cite (Informal):
Efficient Bilingual Generalization from Neural Transduction Grammar Induction (Yan et al., IWSLT 2019)
PDF:
https://preview.aclanthology.org/nschneid-patch-2/2019.iwslt-1.28.pdf