Amit Gajbhiye
2021
Knowledge Distillation for Quality Estimation
Amit Gajbhiye | Marina Fomicheva | Fernando Alva-Manchego | Frédéric Blain | Abiola Obamuyide | Nikolaos Aletras | Lucia Specia
Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021
deepQuest-py: Large and Distilled Models for Quality Estimation
Fernando Alva-Manchego | Abiola Obamuyide | Amit Gajbhiye | Frédéric Blain | Marina Fomicheva | Lucia Specia
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing: System Demonstrations
We introduce deepQuest-py, a framework for training and evaluation of large and light-weight models for Quality Estimation (QE). deepQuest-py provides access to (1) state-of-the-art models based on pre-trained Transformers for sentence-level and word-level QE; (2) light-weight and efficient sentence-level models implemented via knowledge distillation; and (3) a web interface for testing models and visualising their predictions. deepQuest-py is available at https://github.com/sheffieldnlp/deepQuest-py under a CC BY-NC-SA licence.
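The distilled sentence-level models mentioned above follow the standard knowledge-distillation recipe: a small student is trained to reproduce the continuous quality scores predicted by a large teacher. The sketch below is a minimal, generic illustration of that idea with a linear student and synthetic data; it is an assumption-laden toy example, not deepQuest-py's actual API or architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: stand-in features for 200 sentences, and "teacher"
# quality scores produced by a large QE model (here simulated as a noisy
# linear function). The teacher's scores act as soft regression targets.
n_samples, n_features = 200, 8
X = rng.normal(size=(n_samples, n_features))
true_w = rng.normal(size=n_features)
teacher_scores = X @ true_w + 0.05 * rng.normal(size=n_samples)

# The student is a small linear model trained by gradient descent to
# minimise the MSE between its predictions and the teacher's scores.
w = np.zeros(n_features)
lr = 0.05
for _ in range(500):
    residual = X @ w - teacher_scores
    grad = 2 * X.T @ residual / n_samples
    w -= lr * grad

mse = float(np.mean((X @ w - teacher_scores) ** 2))
print(f"student-teacher MSE: {mse:.4f}")
```

In practice the teacher would be a pre-trained Transformer-based QE model and the student a much lighter network, but the training objective (regressing onto the teacher's sentence-level scores) has the same shape.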