Train Hard, Finetune Easy: Multilingual Denoising for RDF-to-Text Generation

Zdeněk Kasner, Ondřej Dušek

Abstract
We describe our system for the RDF-to-text generation task of the WebNLG Challenge 2020. We base our approach on the mBART model, which is pre-trained for multilingual denoising. This allows us to use a simple, identical, end-to-end setup for both English and Russian. Requiring minimal task- or language-specific effort, our model placed in the first third of the leaderboard for English and first or second for Russian on automatic metrics, and it made it into the best or second-best system cluster on human evaluation.
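The abstract describes finetuning a pre-trained multilingual denoising model (mBART) end-to-end on RDF input. As a rough illustration only (not the authors' released code), the sketch below shows how such finetuning could look with the HuggingFace transformers library; the mbart-large-cc25 checkpoint, the "subject | predicate | object" linearization, and the toy triple are assumptions made here for illustration, not details taken from the paper.

from transformers import MBartForConditionalGeneration, MBartTokenizer

# Multilingual denoising checkpoint; an assumed stand-in for the model in the paper.
model_name = "facebook/mbart-large-cc25"
tokenizer = MBartTokenizer.from_pretrained(model_name, src_lang="en_XX", tgt_lang="en_XX")
model = MBartForConditionalGeneration.from_pretrained(model_name)

def linearize_triples(triples):
    # Flatten (subject, predicate, object) RDF triples into one plain string;
    # the separator format is a hypothetical choice, not necessarily the paper's.
    return " ".join(f"{s} | {p} | {o}" for s, p, o in triples)

# Toy WebNLG-style example (illustrative, not from the paper).
triples = [("Alan_Bean", "occupation", "Test_pilot")]
inputs = tokenizer(linearize_triples(triples), return_tensors="pt")
labels = tokenizer(text_target="Alan Bean worked as a test pilot.",
                   return_tensors="pt").input_ids

# One finetuning step: standard cross-entropy loss against the reference text.
loss = model(**inputs, labels=labels).loss
loss.backward()

# At inference time, the finetuned model verbalizes unseen triple sets.
generated = model.generate(**inputs, max_length=64)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))

The same setup would apply to Russian by switching the tokenizer's language codes (e.g., ru_RU), which is what makes the identical end-to-end pipeline possible for both languages.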
Anthology ID:
2020.webnlg-1.20
Volume:
Proceedings of the 3rd International Workshop on Natural Language Generation from the Semantic Web (WebNLG+)
Month:
December
Year:
2020
Address:
Dublin, Ireland (Virtual)
Venue:
WebNLG
SIG:
SIGGEN
Publisher:
Association for Computational Linguistics
Pages:
171–176
URL:
https://aclanthology.org/2020.webnlg-1.20
Cite (ACL):
Zdeněk Kasner and Ondřej Dušek. 2020. Train Hard, Finetune Easy: Multilingual Denoising for RDF-to-Text Generation. In Proceedings of the 3rd International Workshop on Natural Language Generation from the Semantic Web (WebNLG+), pages 171–176, Dublin, Ireland (Virtual). Association for Computational Linguistics.
Cite (Informal):
Train Hard, Finetune Easy: Multilingual Denoising for RDF-to-Text Generation (Kasner & Dušek, WebNLG 2020)
PDF:
https://preview.aclanthology.org/starsem-semeval-split/2020.webnlg-1.20.pdf