Factorized Transformer for Multi-Domain Neural Machine Translation

Yongchao Deng, Hongfei Yu, Heng Yu, Xiangyu Duan, Weihua Luo


Abstract
Multi-Domain Neural Machine Translation (NMT) aims at building a single system that performs well on a range of target domains. However, the extreme diversity of cross-domain wording and phrasing style, imperfections in the training data distribution, and the inherent defects of the current sequential learning process all make multi-domain NMT very challenging. To mitigate these problems, we propose the Factorized Transformer, which consists of an in-depth factorization of the parameters of an NMT model, namely Transformer in this paper, into two categories: domain-shared ones that encode common cross-domain knowledge and domain-specific ones that are private for each constituent domain. We experiment with various designs of our model and conduct extensive validation on an open English-to-French multi-domain dataset. Our approach achieves state-of-the-art performance and opens up new perspectives for multi-domain and open-domain applications.
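To make the shared/private split concrete, below is a minimal PyTorch sketch of the general idea described in the abstract: a projection inside a Transformer sublayer is split into a domain-shared weight and a set of per-domain private weights, selected by a domain id at run time. This is an illustrative assumption, not the authors' exact parameterization; all names (FactorizedLinear, num_domains, domain_id) are hypothetical.

    # Illustrative sketch only -- not the paper's exact factorization.
    import torch
    import torch.nn as nn

    class FactorizedLinear(nn.Module):
        def __init__(self, d_in, d_out, num_domains):
            super().__init__()
            # Shared parameters: encode common cross-domain knowledge.
            self.shared = nn.Linear(d_in, d_out)
            # Domain-specific parameters: one private projection per domain.
            self.private = nn.ModuleList(
                [nn.Linear(d_in, d_out, bias=False) for _ in range(num_domains)]
            )

        def forward(self, x, domain_id):
            # Combine the shared transformation with the selected domain's private one.
            return self.shared(x) + self.private[domain_id](x)

    # Usage: drop the layer into a Transformer sublayer and pass the domain id.
    layer = FactorizedLinear(d_in=512, d_out=512, num_domains=3)
    x = torch.randn(8, 20, 512)   # (batch, length, model dim)
    y = layer(x, domain_id=1)     # route through the second domain's private weights
    print(y.shape)                # torch.Size([8, 20, 512])

In such a setup the shared weights would be trained on all domains while each private block only sees its own domain's data, which is one straightforward way to realize the shared/specific split the abstract describes.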
Anthology ID:
2020.findings-emnlp.377
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4221–4230
URL:
https://aclanthology.org/2020.findings-emnlp.377
DOI:
10.18653/v1/2020.findings-emnlp.377
Cite (ACL):
Yongchao Deng, Hongfei Yu, Heng Yu, Xiangyu Duan, and Weihua Luo. 2020. Factorized Transformer for Multi-Domain Neural Machine Translation. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 4221–4230, Online. Association for Computational Linguistics.
Cite (Informal):
Factorized Transformer for Multi-Domain Neural Machine Translation (Deng et al., Findings 2020)
PDF:
https://aclanthology.org/2020.findings-emnlp.377.pdf