Continual Quality Estimation with Online Bayesian Meta-Learning

Abiola Obamuyide, Marina Fomicheva, Lucia Specia


Abstract
Most current quality estimation (QE) models for machine translation are trained and evaluated in a static setting where training and test data are assumed to come from a fixed distribution. However, in real-life settings, the test data a deployed QE model encounters may differ from its training data. In particular, training samples are often labelled by one annotator or a small set of annotators, whose perceptions of translation quality, and whose needs, may differ substantially from those of the end-users who will rely on the model's predictions in practice. To address this challenge, we propose an online Bayesian meta-learning framework for the continual training of QE models that can adapt them to the needs of different users while remaining robust to distributional shifts between training and test data. Experiments on data with varying numbers of users and language characteristics validate the effectiveness of the proposed approach.
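The abstract does not include pseudocode, so the following is a minimal, hypothetical sketch of what an online, user-adaptive meta-learning loop for QE might look like. Everything here is an illustrative assumption rather than the authors' algorithm: the toy `QERegressor`, the Reptile-style first-order meta-update (standing in for the paper's Bayesian meta-update), and the Gaussian parameter noise used as a crude proxy for sampling from a posterior over models.

```python
# Hypothetical sketch only; names and the update rule are assumptions,
# not the method from Obamuyide et al. (2021).
import copy
import torch
import torch.nn as nn

class QERegressor(nn.Module):
    """Toy QE model: fixed-size sentence-pair features -> quality score."""
    def __init__(self, feat_dim: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, x):
        return self.net(x).squeeze(-1)

def adapt_to_user(meta_model, x, y, inner_lr=1e-2, steps=5):
    """Inner loop: clone the meta-model and fine-tune it on a handful of
    quality labels from a single user (that user's notion of quality)."""
    model = copy.deepcopy(meta_model)
    opt = torch.optim.SGD(model.parameters(), lr=inner_lr)
    for _ in range(steps):
        opt.zero_grad()
        nn.functional.mse_loss(model(x), y).backward()
        opt.step()
    return model

@torch.no_grad()
def online_meta_update(meta_model, adapted_model, meta_lr=0.1, noise_std=1e-3):
    """Outer loop, run once per incoming user batch. A Reptile-style
    first-order step moves the meta-parameters toward the user-adapted
    solution; the Gaussian noise is a stand-in for posterior sampling."""
    for p_meta, p_user in zip(meta_model.parameters(),
                              adapted_model.parameters()):
        p_meta.add_(meta_lr * (p_user - p_meta))
        p_meta.add_(noise_std * torch.randn_like(p_meta))

# Usage on a synthetic stream: each "user" arrives with a few labelled pairs.
torch.manual_seed(0)
meta_model = QERegressor()
for step in range(100):
    x = torch.randn(8, 32)                     # features for 8 translations
    y = torch.randn(8)                         # this user's quality scores
    adapted = adapt_to_user(meta_model, x, y)  # personalise
    online_meta_update(meta_model, adapted)    # consolidate online
```

The design point this illustrates is the one the abstract makes: per-user adaptation happens in a fast inner loop, while the slower online outer update accumulates knowledge across users without assuming a fixed data distribution.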
Anthology ID:
2021.acl-short.25
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)
Month:
August
Year:
2021
Address:
Online
Editors:
Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
190–197
URL:
https://aclanthology.org/2021.acl-short.25
DOI:
10.18653/v1/2021.acl-short.25
Bibkey:
Cite (ACL):
Abiola Obamuyide, Marina Fomicheva, and Lucia Specia. 2021. Continual Quality Estimation with Online Bayesian Meta-Learning. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pages 190–197, Online. Association for Computational Linguistics.
Cite (Informal):
Continual Quality Estimation with Online Bayesian Meta-Learning (Obamuyide et al., ACL-IJCNLP 2021)
PDF:
https://preview.aclanthology.org/nschneid-patch-3/2021.acl-short.25.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-3/2021.acl-short.25.mp4