Abstract
Large-scale pre-trained representations such as BERT have been widely used in many natural language understanding tasks, but methods for incorporating BERT into document-level machine translation are still being explored. Because BERT is pre-trained with the next sentence prediction task, it is able to model relationships between sentences. In our work, we leverage this property to improve document-level machine translation. In our proposed model, BERT serves as a context encoder that extracts document-level contextual information, which is then integrated into both the encoder and the decoder. Experimental results show that our proposed method significantly outperforms strong document-level machine translation baselines in BLEU score. Moreover, an ablation study shows that our method captures document-level context to boost translation performance.
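The abstract only outlines the architecture, so the following is a minimal sketch of the general idea: a pre-trained BERT encodes the surrounding document context, and its hidden states are fused into a Transformer translation layer through an extra cross-attention block. This is not the authors' code; the model name, layer dimensions, the gated fusion, and keeping BERT frozen are all illustrative assumptions.

```python
# Sketch only: BERT as a document-context encoder whose states are fused
# into an NMT layer via cross-attention. Details are assumptions, not the
# paper's exact method.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast

class ContextAwareLayer(nn.Module):
    """Transformer sub-layer that additionally attends over BERT context states."""
    def __init__(self, d_model=512, n_heads=8, d_bert=768):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ctx_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.proj_ctx = nn.Linear(d_bert, d_model)   # map BERT dims to NMT dims
        self.gate = nn.Linear(2 * d_model, d_model)  # assumed gated fusion
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x, bert_ctx):
        # Standard self-attention over the current sentence.
        h, _ = self.self_attn(x, x, x)
        x = self.norm1(x + h)
        # Cross-attention from the sentence into the BERT-encoded context.
        ctx = self.proj_ctx(bert_ctx)
        c, _ = self.ctx_attn(x, ctx, ctx)
        # Gated combination of sentence and context representations.
        g = torch.sigmoid(self.gate(torch.cat([x, c], dim=-1)))
        return self.norm2(x + g * c)

# Encode preceding sentences with BERT (kept frozen here for illustration).
tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased").eval()

context = "The previous sentences of the document."
with torch.no_grad():
    ctx_states = bert(**tokenizer(context, return_tensors="pt")).last_hidden_state

layer = ContextAwareLayer()
src = torch.randn(1, 10, 512)   # stand-in for NMT encoder states of the current sentence
out = layer(src, ctx_states)    # context-aware representation
print(out.shape)                # torch.Size([1, 10, 512])
```

In the paper, this kind of context attention is applied in both the encoder and the decoder; the sketch shows a single layer only.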
- Anthology ID: 2020.aacl-srw.15
- Volume: Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing: Student Research Workshop
- Month: December
- Year: 2020
- Address: Suzhou, China
- Editors: Boaz Shmueli, Yin Jou Huang
- Venue: AACL
- Publisher: Association for Computational Linguistics
- Pages: 101–107
- URL: https://aclanthology.org/2020.aacl-srw.15
- Cite (ACL): Zhiyu Guo and Minh Le Nguyen. 2020. Document-Level Neural Machine Translation Using BERT as Context Encoder. In Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing: Student Research Workshop, pages 101–107, Suzhou, China. Association for Computational Linguistics.
- Cite (Informal): Document-Level Neural Machine Translation Using BERT as Context Encoder (Guo & Nguyen, AACL 2020)
- PDF: https://aclanthology.org/2020.aacl-srw.15.pdf