Alexander Molchanov


2022

PROMT Systems for WMT22 General Translation Task
Alexander Molchanov | Vladislav Kovalenko | Natalia Makhamalkina
Proceedings of the Seventh Conference on Machine Translation (WMT)

The PROMT systems are trained with the MarianNMT toolkit. All systems use the transformer-big configuration. We use BPE for text encoding; vocabulary sizes vary from 24k to 32k across language pairs. All systems are unconstrained: we use all data provided by the WMT organizers, all publicly available data, and some private data. We participate in four directions: English-Russian, English-German, German-English, and Ukrainian-English.
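For reference, a transformer-big setup in MarianNMT can be expressed as a training config along these lines (a hedged sketch; the file paths, vocab size, and exact option values are illustrative, not the PROMT configuration):

```yaml
# Illustrative MarianNMT training config for a transformer-big model.
# Paths and the 32k vocabulary size are placeholders.
type: transformer
dim-emb: 1024
transformer-dim-ffn: 4096
transformer-heads: 16
enc-depth: 6
dec-depth: 6
train-sets: [corpus.src, corpus.trg]
vocabs: [vocab.src.yml, vocab.trg.yml]
dim-vocabs: [32000, 32000]
```

Marian also ships a `--task transformer-big` preset that sets comparable dimensions in one flag.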

2021

PROMT Systems for WMT21 Terminology Translation Task
Alexander Molchanov | Vladislav Kovalenko | Fedor Bykov
Proceedings of the Sixth Conference on Machine Translation

This paper describes the PROMT submissions for the WMT21 Terminology Translation Task. We participate in two directions: English to French and English to Russian. Our final submissions are MarianNMT-based neural systems. We present two technologies for terminology translation: a modification of the Dinu et al. (2019) soft-constrained approach and our own approach called PROMT Smart Neural Dictionary (SmartND). We achieve good results in both directions.
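The soft-constrained approach of Dinu et al. (2019) injects target-side terminology into the source sequence at training and inference time, marking tokens with an extra factor stream so the model learns to copy the injected translations. A minimal sketch of that annotation step (the function name, single-token matching, and factor scheme here are our own simplification, not the PROMT or SmartND implementation):

```python
def annotate_with_terms(source_tokens, term_dict):
    """Append target-language term translations after matching source tokens,
    in the style of Dinu et al. (2019) soft constraints.

    Factor stream: 0 = regular source token, 1 = source term,
    2 = injected target translation. `term_dict` is a hypothetical
    source-to-target terminology lookup; real systems match multi-word
    terms and lemmatized variants.
    """
    tokens, factors = [], []
    for tok in source_tokens:
        if tok in term_dict:
            tokens.append(tok)
            factors.append(1)
            tokens.append(term_dict[tok])  # inject the desired translation
            factors.append(2)
        else:
            tokens.append(tok)
            factors.append(0)
    return tokens, factors
```

For example, annotating `["the", "transformer", "model"]` with the entry `{"transformer": "transformateur"}` yields the token stream `["the", "transformer", "transformateur", "model"]` with factors `[0, 1, 2, 0]`.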

2020

PROMT Systems for WMT 2020 Shared News Translation Task
Alexander Molchanov
Proceedings of the Fifth Conference on Machine Translation

This paper describes the PROMT submissions for the WMT 2020 Shared News Translation Task. This year we participated in four language pairs and six directions: English-Russian, Russian-English, English-German, German-English, Polish-English and Czech-English. All our submissions are MarianNMT-based neural systems. We use more data than last year and update our back-translations with better models from the previous year. We show competitive results in terms of BLEU in most directions.

2019

PROMT Systems for WMT 2019 Shared Translation Task
Alexander Molchanov
Proceedings of the Fourth Conference on Machine Translation (Volume 2: Shared Task Papers, Day 1)

This paper describes the PROMT submissions for the WMT 2019 Shared News Translation Task. This year we participated in two language pairs and three directions: English-Russian, English-German and German-English. All our submissions are Marian-based neural systems. We use significantly more data than last year. We also present our improved data filtering pipeline.

2018

PROMT Systems for WMT 2018 Shared Translation Task
Alexander Molchanov
Proceedings of the Third Conference on Machine Translation: Shared Task Papers

This paper describes the PROMT submissions for the WMT 2018 Shared News Translation Task. This year we participated only in the English-Russian language pair. We built two primary neural systems: 1) a pure Marian-based neural system and 2) a hybrid system which incorporates an OpenNMT-based neural post-editing component into our RBMT engine. We also submitted a pure rule-based (RBMT) translation for contrast. Both primary submissions show competitive results and significantly outperform the RBMT baseline.

2016

PROMT Translation Systems for WMT 2016 Translation Tasks
Alexander Molchanov | Fedor Bykov
Proceedings of the First Conference on Machine Translation: Volume 2, Shared Task Papers

2012

PROMT DeepHybrid system for WMT12 shared translation task
Alexander Molchanov
Proceedings of the Seventh Workshop on Statistical Machine Translation