Abstract
The WMT 2023 Shared Task on Low-Resource Indic Language Translation featured translation to and from Assamese, Khasi, Manipuri, and Mizo on one side and English on the other. We submitted supervised neural machine translation systems for each language pair and direction, and experimented with different configurations and settings for both preprocessing and training. Even though most of them did not reach competitive performance, our experiments uncovered some interesting points for further investigation, namely the relation between dataset and model size, and the impact of the training framework. Moreover, the results of some of our preliminary experiments on word embedding initialization, backtranslation, and model depth were in contrast with previous work. The final results also show some disagreement among the automated metrics employed in the evaluation.
- Anthology ID: 2023.wmt-1.91
- Volume: Proceedings of the Eighth Conference on Machine Translation
- Month: December
- Year: 2023
- Address: Singapore
- Editors: Philipp Koehn, Barry Haddow, Tom Kocmi, Christof Monz
- Venue: WMT
- SIG: SIGMT
- Publisher: Association for Computational Linguistics
- Pages: 959–966
- URL: https://aclanthology.org/2023.wmt-1.91
- DOI: 10.18653/v1/2023.wmt-1.91
- Cite (ACL): Edoardo Signoroni and Pavel Rychly. 2023. MUNI-NLP Systems for Low-resource Indic Machine Translation. In Proceedings of the Eighth Conference on Machine Translation, pages 959–966, Singapore. Association for Computational Linguistics.
- Cite (Informal): MUNI-NLP Systems for Low-resource Indic Machine Translation (Signoroni & Rychly, WMT 2023)
- PDF: https://preview.aclanthology.org/nschneid-patch-4/2023.wmt-1.91.pdf