Effectiveness of Data Augmentation and Pretraining for Improving Neural Headline Generation in Low-Resource Settings

Matej Martinc, Syrielle Montariol, Lidia Pivovarova, Elaine Zosa

Abstract
We tackle the problem of neural headline generation in a low-resource setting, where only a limited amount of data is available for training a model. We compare an ideal high-resource scenario on English with results obtained on a smaller subset of the same data, and also run experiments on two small news corpora covering the low-resource languages Croatian and Estonian. We investigate two options for headline generation in a multilingual low-resource scenario: a pretrained multilingual encoder-decoder model, and a combination of two pretrained language models, one used as the encoder and the other as the decoder, connected by a cross-attention layer that must be trained from scratch. The results show that the first approach outperforms the second by a large margin. To improve the performance of both models, we explore several data augmentation and pretraining strategies and show that, while they drastically improve the second approach, they have little to no effect on the pretrained encoder-decoder model. Finally, we propose two new measures for evaluating model performance in addition to the classic ROUGE scores.
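The two setups described in the abstract can be sketched with the Hugging Face Transformers library. This is a hedged illustration under assumed checkpoint names, not the authors' exact configuration; it only shows how the two architectures differ in which weights arrive pretrained.

```python
# Minimal sketch of the two headline-generation setups compared in the
# paper (hypothetical checkpoints; not the authors' exact configuration).
from transformers import AutoModelForSeq2SeqLM, EncoderDecoderModel
from rouge_score import rouge_scorer  # pip install rouge-score

# Option 1: a pretrained multilingual encoder-decoder model, fine-tuned
# directly on article -> headline pairs; all weights are pretrained.
seq2seq = AutoModelForSeq2SeqLM.from_pretrained("facebook/mbart-large-cc25")

# Option 2: two pretrained language models joined into an encoder-decoder.
# The cross-attention layers connecting them are randomly initialized and
# must be learned from scratch on the (limited) headline data.
enc_dec = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "bert-base-multilingual-cased",  # encoder
    "bert-base-multilingual-cased",  # decoder (cross-attention added here)
)

# Classic ROUGE evaluation, which the paper's two proposed measures extend.
scorer = rouge_scorer.RougeScorer(["rouge1", "rouge2", "rougeL"], use_stemmer=True)
scores = scorer.score("reference headline", "generated headline")
```

In Option 2, only the cross-attention parameters start untrained, which is consistent with the abstract's observation that data augmentation and pretraining strategies mainly benefit this second setup.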
Anthology ID: 2022.lrec-1.381
Volume: Proceedings of the Thirteenth Language Resources and Evaluation Conference
Month: June
Year: 2022
Address: Marseille, France
Venue: LREC
Publisher: European Language Resources Association
Pages: 3561–3570
URL: https://aclanthology.org/2022.lrec-1.381
Cite (ACL): Matej Martinc, Syrielle Montariol, Lidia Pivovarova, and Elaine Zosa. 2022. Effectiveness of Data Augmentation and Pretraining for Improving Neural Headline Generation in Low-Resource Settings. In Proceedings of the Thirteenth Language Resources and Evaluation Conference, pages 3561–3570, Marseille, France. European Language Resources Association.
Cite (Informal): Effectiveness of Data Augmentation and Pretraining for Improving Neural Headline Generation in Low-Resource Settings (Martinc et al., LREC 2022)
PDF: https://preview.aclanthology.org/ingestion-script-update/2022.lrec-1.381.pdf
Data: KPTimes