Context-aware Neural Machine Translation with Mini-batch Embedding

Makoto Morishita, Jun Suzuki, Tomoharu Iwata, Masaaki Nagata


Abstract
It is crucial to provide inter-sentence context to Neural Machine Translation (NMT) models for higher-quality translation. As a simple approach to incorporating inter-sentence information, we propose mini-batch embedding (MBE), a representation of the features of the sentences in a mini-batch. We construct each mini-batch from sentences of the same document, so the MBE is expected to capture contextual information across sentences. We incorporate MBE into an NMT model, and our experiments show that the proposed method consistently outperforms strong baselines in translation quality and improves writing style and terminology choices to fit the document's context.
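The abstract describes the mechanism only at a high level; the sketch below illustrates one plausible reading of it. It batches sentences drawn from the same document, pools their encoder states into a single mini-batch vector, and feeds that vector back to every sentence as shared context. The mean pooling, linear projection, and additive fusion here are illustrative assumptions of this sketch, not the paper's exact formulation (see nttcslab-nlp/mbe-nmt for the authors' implementation).

import torch
import torch.nn as nn

class MiniBatchEmbedding(nn.Module):
    """Pools a mini-batch of same-document encoder states into one
    context vector (the MBE) and adds it back to every token."""

    def __init__(self, d_model: int):
        super().__init__()
        self.proj = nn.Linear(d_model, d_model)

    def forward(self, enc_out: torch.Tensor, pad_mask: torch.Tensor) -> torch.Tensor:
        # enc_out:  (batch, seq_len, d_model) encoder states, one sentence per row
        # pad_mask: (batch, seq_len), 1 for real tokens, 0 for padding
        m = pad_mask.unsqueeze(-1).to(enc_out.dtype)
        # Mean-pool each sentence over its non-pad tokens.
        sent_emb = (enc_out * m).sum(dim=1) / m.sum(dim=1).clamp(min=1.0)
        # Pool the sentence vectors across the mini-batch into a single MBE
        # (assumption: mean pooling plus a learned projection).
        mbe = self.proj(sent_emb.mean(dim=0))  # (d_model,)
        # Broadcast the MBE back as extra context for every token.
        return enc_out + mbe.view(1, 1, -1)

# Usage: a mini-batch of 4 sentences drawn from the same document.
enc = torch.randn(4, 10, 512)   # dummy encoder output
pad_mask = torch.ones(4, 10)    # no padding in this toy example
ctx = MiniBatchEmbedding(512)(enc, pad_mask)
print(ctx.shape)                # torch.Size([4, 10, 512])

Note that this scheme only works if batching is document-aware at both training and inference time, e.g. by grouping consecutive sentences of one document into each mini-batch, as the abstract describes.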
Anthology ID:
2021.eacl-main.214
Volume:
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume
Month:
April
Year:
2021
Address:
Online
Editors:
Paola Merlo, Jörg Tiedemann, Reut Tsarfaty
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
2513–2521
URL:
https://preview.aclanthology.org/build-pipeline-with-new-library/2021.eacl-main.214/
DOI:
10.18653/v1/2021.eacl-main.214
Cite (ACL):
Makoto Morishita, Jun Suzuki, Tomoharu Iwata, and Masaaki Nagata. 2021. Context-aware Neural Machine Translation with Mini-batch Embedding. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume, pages 2513–2521, Online. Association for Computational Linguistics.
Cite (Informal):
Context-aware Neural Machine Translation with Mini-batch Embedding (Morishita et al., EACL 2021)
PDF:
https://preview.aclanthology.org/build-pipeline-with-new-library/2021.eacl-main.214.pdf
Code:
nttcslab-nlp/mbe-nmt
Data:
ASPEC