Specializing Multi-domain NMT via Penalizing Low Mutual Information
Jiyoung Lee | Hantae Kim | Hyunchang Cho | Edward Choi | Cheonbok Park
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Multi-domain Neural Machine Translation (NMT) trains a single model on multiple domains. It is appealing because of its efficacy in handling multiple domains within one model. An ideal multi-domain NMT model learns distinctive domain characteristics simultaneously; however, grasping each domain's peculiarities is a non-trivial task. In this paper, we investigate domain-specific information through the lens of mutual information (MI) and propose a new objective that penalizes low MI, pushing it to become higher. Our method achieves state-of-the-art performance among current competitive multi-domain NMT models. We also show that our objective promotes low MI to become higher, resulting in a domain-specialized multi-domain NMT model.
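For intuition (a hedged sketch, not necessarily the paper's exact formulation), the pointwise mutual information between a domain $d$ and a target token $y_t$, conditioned on the source sentence $x$ and the translation prefix $y_{<t}$, can be written as

\[
\operatorname{MI}(d; y_t \mid x, y_{<t}) = \log \frac{p(y_t \mid x, d, y_{<t})}{p(y_t \mid x, y_{<t})},
\]

where the domain-agnostic probability marginalizes over domains, $p(y_t \mid x, y_{<t}) = \sum_{d'} p(d' \mid x)\, p(y_t \mid x, d', y_{<t})$. A value near zero means the domain label barely changes the model's prediction, so an objective that penalizes low MI pushes predictions to become more domain-dependent.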