Abstract
This paper presents our submission to the WMT 2022 quality estimation shared task, specifically to the sentence-level direct assessment (DA) quality prediction subtask. We build a multilingual system based on the predictor–estimator architecture, using the XLM-RoBERTa transformer for feature extraction and a regression head on top of the model to estimate the z-standardized DA labels. Furthermore, we use pretrained models to extract useful knowledge that reflects various criteria of quality assessment and demonstrates good correlation with human judgements. We optimize the performance of our model by incorporating this information as additional external features in the input data and by applying Monte Carlo dropout during both training and inference.
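As a concrete illustration of the architecture the abstract describes, below is a minimal sketch (not the authors' released code) of an XLM-RoBERTa encoder with a linear regression head predicting z-standardized DA scores, with Monte Carlo dropout applied at inference by keeping dropout layers active across several stochastic forward passes. The model name, dropout rate, sample count, and example sentence pair are illustrative assumptions.

```python
# Sketch of a predictor-estimator style DA regressor with MC dropout.
# Assumptions: xlm-roberta-base encoder, dropout p=0.1, 30 MC samples.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class DAEstimator(nn.Module):
    def __init__(self, encoder_name="xlm-roberta-base", dropout=0.1):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        self.dropout = nn.Dropout(dropout)
        # Regression head mapping the sentence representation to a z-score.
        self.head = nn.Linear(self.encoder.config.hidden_size, 1)

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]  # <s> token as sentence summary
        return self.head(self.dropout(cls)).squeeze(-1)

def mc_dropout_predict(model, batch, n_samples=30):
    """Monte Carlo dropout at inference: keep dropout layers active
    and average the predictions of several stochastic forward passes."""
    model.train()  # train mode keeps dropout on; no gradients are taken
    with torch.no_grad():
        preds = torch.stack([model(**batch) for _ in range(n_samples)])
    return preds.mean(dim=0), preds.std(dim=0)  # mean score, uncertainty

# Usage: encode a <source, translation> pair and score it.
tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
enc = tokenizer(["Das ist ein Test."], ["This is a test."],
                return_tensors="pt", padding=True, truncation=True)
model = DAEstimator()
score, uncertainty = mc_dropout_predict(
    model, {"input_ids": enc["input_ids"],
            "attention_mask": enc["attention_mask"]})
```

The standard deviation over the MC samples is a byproduct of this scheme and can serve as a per-sentence uncertainty estimate alongside the averaged score.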
- Anthology ID:
- 2022.wmt-1.62
- Volume:
- Proceedings of the Seventh Conference on Machine Translation (WMT)
- Month:
- December
- Year:
- 2022
- Address:
- Abu Dhabi, United Arab Emirates (Hybrid)
- Editors:
- Philipp Koehn, Loïc Barrault, Ondřej Bojar, Fethi Bougares, Rajen Chatterjee, Marta R. Costa-jussà, Christian Federmann, Mark Fishel, Alexander Fraser, Markus Freitag, Yvette Graham, Roman Grundkiewicz, Paco Guzman, Barry Haddow, Matthias Huck, Antonio Jimeno Yepes, Tom Kocmi, André Martins, Makoto Morishita, Christof Monz, Masaaki Nagata, Toshiaki Nakazawa, Matteo Negri, Aurélie Névéol, Mariana Neves, Martin Popel, Marco Turchi, Marcos Zampieri
- Venue:
- WMT
- SIG:
- SIGMT
- Publisher:
- Association for Computational Linguistics
- Pages:
- 653–660
- URL:
- https://aclanthology.org/2022.wmt-1.62
- Cite (ACL):
- Eirini Zafeiridou and Sokratis Sofianopoulos. 2022. Welocalize-ARC/NKUA’s Submission to the WMT 2022 Quality Estimation Shared Task. In Proceedings of the Seventh Conference on Machine Translation (WMT), pages 653–660, Abu Dhabi, United Arab Emirates (Hybrid). Association for Computational Linguistics.
- Cite (Informal):
- Welocalize-ARC/NKUA’s Submission to the WMT 2022 Quality Estimation Shared Task (Zafeiridou & Sofianopoulos, WMT 2022)
- PDF:
- https://preview.aclanthology.org/ingest-acl-2023-videos/2022.wmt-1.62.pdf