Abstract
This paper describes our neural machine translation systems' participation in WAT 2020 (team ID: goku20). We participated in the (i) Patent, (ii) Business Scene Dialogue (BSD) document-level translation, and (iii) Mixed-domain tasks. Despite their simplicity, standard Transformer models have proven very effective in many machine translation systems. Recently, several advanced pre-trained generative models have been proposed on the basis of the encoder-decoder framework. The main focus of this work is to explore how robustly Transformer models perform in translation from the sentence level to the document level and from resource-rich to low-resource languages. Additionally, we investigated the improvements that fine-tuning on top of pre-trained Transformer-based models can achieve on various tasks.
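As a rough illustration of the fine-tuning step mentioned in the abstract, the sketch below continues training a publicly available pre-trained encoder-decoder checkpoint on a handful of in-domain sentence pairs. The mBART-50 checkpoint, the Hugging Face transformers/datasets APIs, the Japanese-to-English direction, and all hyperparameters are assumptions made only for this example; the paper does not state that this toolkit or model was used.

```python
from datasets import Dataset
from transformers import (
    DataCollatorForSeq2Seq,
    MBart50TokenizerFast,
    MBartForConditionalGeneration,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

# Illustrative pre-trained checkpoint; the paper does not name a specific model.
checkpoint = "facebook/mbart-large-50"
tokenizer = MBart50TokenizerFast.from_pretrained(
    checkpoint, src_lang="ja_XX", tgt_lang="en_XX"
)
model = MBartForConditionalGeneration.from_pretrained(checkpoint)

# Toy parallel pairs standing in for an in-domain corpus (e.g., BSD dialogues or patents).
pairs = Dataset.from_dict({
    "ja": ["会議は明日の午前十時に始まります。", "本発明は半導体装置に関する。"],
    "en": ["The meeting starts at ten tomorrow morning.",
           "The present invention relates to a semiconductor device."],
})

def preprocess(batch):
    # Tokenize source and target sides jointly for encoder-decoder training.
    return tokenizer(batch["ja"], text_target=batch["en"],
                     truncation=True, max_length=128)

tokenized = pairs.map(preprocess, batched=True, remove_columns=["ja", "en"])

args = Seq2SeqTrainingArguments(
    output_dir="ft-ja-en",          # placeholder output directory
    learning_rate=3e-5,             # illustrative hyperparameters only
    num_train_epochs=1,
    per_device_train_batch_size=2,
    predict_with_generate=True,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
# Fine-tuning simply continues training the pre-trained weights on the new pairs.
trainer.train()
```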
- Anthology ID:
- 2020.wat-1.16
- Volume:
- Proceedings of the 7th Workshop on Asian Translation
- Month:
- December
- Year:
- 2020
- Address:
- Suzhou, China
- Editors:
- Toshiaki Nakazawa, Hideki Nakayama, Chenchen Ding, Raj Dabre, Anoop Kunchukuttan, Win Pa Pa, Ondřej Bojar, Shantipriya Parida, Isao Goto, Hideya Mino, Hiroshi Manabe, Katsuhito Sudoh, Sadao Kurohashi, Pushpak Bhattacharyya
- Venue:
- WAT
- Publisher:
- Association for Computational Linguistics
- Pages:
- 135–141
- URL:
- https://aclanthology.org/2020.wat-1.16
- Cite (ACL):
- Dongzhe Wang and Ohnmar Htun. 2020. Goku’s Participation in WAT 2020. In Proceedings of the 7th Workshop on Asian Translation, pages 135–141, Suzhou, China. Association for Computational Linguistics.
- Cite (Informal):
- Goku’s Participation in WAT 2020 (Wang & Htun, WAT 2020)
- PDF:
- https://aclanthology.org/2020.wat-1.16.pdf
- Data
- JESC