Abstract
This paper describes Octanove Labs’ submission to the IWSLT 2020 open domain translation challenge. To build a high-quality Japanese-Chinese neural machine translation (NMT) system, we use a combination of 1) parallel corpus filtering and 2) back-translation. We show that, by using heuristic rules and learned classifiers, the size of the parallel data can be reduced by 70% to 90% without much impact on final MT performance. We also show that adding artificially generated parallel data through back-translation further boosts the metric by 17% to 27%, while self-training contributes little. Aside from a small number of parallel sentences annotated for filtering, no external resources were used to build our system.
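The two components of the pipeline can be illustrated with short sketches. The first is a minimal, hypothetical version of the heuristic filtering stage; the specific checks and thresholds below (length bounds, length ratio, script checks) are illustrative assumptions, not the authors’ exact rules.

```python
import re

HAN = re.compile(r"[\u4e00-\u9fff]")   # CJK unified ideographs
KANA = re.compile(r"[\u3040-\u30ff]")  # hiragana and katakana

def keep_pair(ja: str, zh: str,
              min_chars: int = 2, max_chars: int = 200,
              max_ratio: float = 3.0) -> bool:
    """Return True if a (Japanese, Chinese) pair survives the heuristics."""
    lj, lz = len(ja), len(zh)
    # Reject empty, overly long, or wildly length-mismatched pairs.
    if not (min_chars <= lj <= max_chars and min_chars <= lz <= max_chars):
        return False
    if max(lj, lz) / min(lj, lz) > max_ratio:
        return False
    # Reject identical pairs, which are usually untranslated copies.
    if ja == zh:
        return False
    # Script sanity checks: the Japanese side should contain kana or Han
    # characters, and the Chinese side should contain Han characters.
    if not (KANA.search(ja) or HAN.search(ja)):
        return False
    if not HAN.search(zh):
        return False
    return True

pairs = [
    ("今日は良い天気です。", "今天天气很好。"),      # kept
    ("http://example.com", "http://example.com"),  # dropped: identical copy
]
filtered = [p for p in pairs if keep_pair(*p)]
```

The second component, back-translation, creates synthetic training pairs by translating target-side monolingual text with a reverse model. The `zh_to_ja` function below is a stand-in for whatever reverse (Chinese-to-Japanese) NMT model is available; its name and interface are assumptions for illustration.

```python
from typing import Callable, Iterable, List, Tuple

def back_translate(zh_sentences: Iterable[str],
                   zh_to_ja: Callable[[str], str]) -> List[Tuple[str, str]]:
    """Pair authentic Chinese targets with synthetic Japanese sources."""
    synthetic = []
    for zh in zh_sentences:
        ja = zh_to_ja(zh)           # synthetic source from the reverse model
        synthetic.append((ja, zh))  # the authentic target stays untouched
    return synthetic

# The synthetic pairs are then mixed with the filtered genuine parallel
# data to train the final Japanese-to-Chinese system.
```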
- Anthology ID: 2020.iwslt-1.20
- Volume: Proceedings of the 17th International Conference on Spoken Language Translation
- Month: July
- Year: 2020
- Address: Online
- Venue: IWSLT
- SIG: SIGSLT
- Publisher: Association for Computational Linguistics
- Pages: 166–171
- URL: https://aclanthology.org/2020.iwslt-1.20
- DOI: 10.18653/v1/2020.iwslt-1.20
- Cite (ACL): Masato Hagiwara. 2020. Octanove Labs’ Japanese-Chinese Open Domain Translation System. In Proceedings of the 17th International Conference on Spoken Language Translation, pages 166–171, Online. Association for Computational Linguistics.
- Cite (Informal): Octanove Labs’ Japanese-Chinese Open Domain Translation System (Hagiwara, IWSLT 2020)
- PDF: https://aclanthology.org/2020.iwslt-1.20.pdf