Towards Zero-Shot Multilingual Synthetic Question and Answer Generation for Cross-Lingual Reading Comprehension

Siamak Shakeri, Noah Constant, Mihir Kale, Linting Xue


Abstract
We propose a simple method for generating multilingual question and answer pairs on a large scale using a single generative model. These synthetic samples can be used to improve the zero-shot performance of multilingual QA models on target languages. Our proposed multi-task training of the generative model requires labeled training samples only in English, removing the need for such samples in the target languages and making the approach applicable to far more languages than those with labeled data. Human evaluations indicate that the majority of the generated samples are grammatically correct and sensible. Experimental results show that our approach achieves large gains on the XQuAD dataset, reducing the gap between the zero-shot and supervised performance of smaller QA models across various languages.
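As an illustration of the approach described in the abstract, the sketch below shows how one might sample synthetic question and answer pairs from an unlabeled target-language passage with a single multilingual seq2seq model via the Hugging Face transformers API. The checkpoint name, prompt string, and output format are assumptions made for illustration, not the authors' exact setup.

# Illustrative sketch (not the authors' exact implementation): sample synthetic
# (question, answer) pairs from an unlabeled target-language passage with a
# single multilingual seq2seq model.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Hypothetical checkpoint: an mT5-style model assumed to be already fine-tuned
# on English passages to emit "question: ... answer: ..." strings.
MODEL_NAME = "google/mt5-base"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

def generate_qa_pairs(passage, num_samples=5):
    """Sample question/answer pairs conditioned on a (possibly non-English) passage."""
    inputs = tokenizer("generate question and answer: " + passage,
                       return_tensors="pt", truncation=True, max_length=512)
    outputs = model.generate(
        **inputs,
        do_sample=True,          # sampling yields diverse synthetic pairs
        top_k=50,
        max_new_tokens=64,
        num_return_sequences=num_samples,
    )
    pairs = []
    for text in tokenizer.batch_decode(outputs, skip_special_tokens=True):
        # Split the assumed "question: ... answer: ..." output format.
        if "answer:" in text:
            question, answer = text.split("answer:", 1)
            pairs.append((question.replace("question:", "").strip(), answer.strip()))
    return pairs

# The resulting pairs can then be added to the training data of a downstream
# extractive QA model for the target language.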
Anthology ID:
2021.inlg-1.4
Volume:
Proceedings of the 14th International Conference on Natural Language Generation
Month:
August
Year:
2021
Address:
Aberdeen, Scotland, UK
Editors:
Anya Belz, Angela Fan, Ehud Reiter, Yaji Sripada
Venue:
INLG
SIG:
SIGGEN
Publisher:
Association for Computational Linguistics
Pages:
35–45
URL:
https://aclanthology.org/2021.inlg-1.4
DOI:
10.18653/v1/2021.inlg-1.4
Cite (ACL):
Siamak Shakeri, Noah Constant, Mihir Kale, and Linting Xue. 2021. Towards Zero-Shot Multilingual Synthetic Question and Answer Generation for Cross-Lingual Reading Comprehension. In Proceedings of the 14th International Conference on Natural Language Generation, pages 35–45, Aberdeen, Scotland, UK. Association for Computational Linguistics.
Cite (Informal):
Towards Zero-Shot Multilingual Synthetic Question and Answer Generation for Cross-Lingual Reading Comprehension (Shakeri et al., INLG 2021)
PDF:
https://aclanthology.org/2021.inlg-1.4.pdf
Data:
C4, MLQA, SQuAD, TyDiQA, XQuAD, mC4