2022
Does Meta-learning Help mBERT for Few-shot Question Generation in a Cross-lingual Transfer Setting for Indic Languages?
Aniruddha Roy | Rupak Kumar Thakur | Isha Sharma | Ashim Gupta | Amrith Krishna | Sudeshna Sarkar | Pawan Goyal
Proceedings of the 29th International Conference on Computational Linguistics
Few-shot Question Generation (QG) is an important and challenging problem in Natural Language Generation (NLG). Multilingual BERT (mBERT) has been used successfully in various Natural Language Understanding (NLU) applications. However, how to utilize mBERT for few-shot QG, possibly with cross-lingual transfer, remains an open question. In this paper, we explore how mBERT performs in few-shot QG with cross-lingual transfer, and whether applying meta-learning to mBERT further improves the results. In our setting, we take mBERT as the base model and fine-tune it with a sequence-to-sequence language modeling framework in a cross-lingual setting. We then apply model-agnostic meta-learning (MAML) to this base model. We evaluate our model on two low-resource Indian languages, Bengali and Telugu, using the TyDi QA dataset. The proposed approach consistently improves on the base model in few-shot settings and even outperforms some heavily parameterized models. Human evaluation also confirms the effectiveness of our approach.
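The abstract describes a two-stage recipe: fine-tune mBERT as a sequence-to-sequence QG model, then meta-train it across languages with MAML. Below is a minimal first-order MAML (FOMAML) sketch of that recipe in PyTorch/Transformers; the BERT2BERT wrapper, the task-batch structure, and all hyperparameters are illustrative assumptions, not the authors' exact setup.

```python
import copy
import torch
from transformers import BertTokenizerFast, EncoderDecoderModel

# Hypothetical BERT2BERT setup: tie an mBERT encoder to an mBERT decoder.
tokenizer = BertTokenizerFast.from_pretrained("bert-base-multilingual-cased")
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "bert-base-multilingual-cased", "bert-base-multilingual-cased"
)
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.pad_token_id = tokenizer.pad_token_id

meta_optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)
inner_lr = 1e-4  # illustrative inner-loop learning rate


def qg_loss(m, batch):
    # batch: input_ids / attention_mask encode the passage (and answer);
    # labels hold the target question token ids.
    return m(**batch).loss


def meta_step(task_batches):
    """One FOMAML outer step over a batch of per-language QG 'tasks'.
    Each task supplies a (support, query) pair of few-shot batches."""
    meta_optimizer.zero_grad()
    for support, query in task_batches:
        learner = copy.deepcopy(model)  # task-specific fast weights
        inner_opt = torch.optim.SGD(learner.parameters(), lr=inner_lr)
        qg_loss(learner, support).backward()  # inner adaptation step
        inner_opt.step()
        learner.zero_grad()
        qg_loss(learner, query).backward()  # query loss on adapted weights
        # First-order approximation: use the adapted model's query
        # gradients directly as the meta-gradient.
        for p, lp in zip(model.parameters(), learner.parameters()):
            if lp.grad is not None:
                p.grad = lp.grad.clone() if p.grad is None else p.grad + lp.grad
    meta_optimizer.step()
```

Each "task" here would be a few-shot QG episode for one language; the first-order approximation sidesteps the second-order gradients of full MAML, which are costly for a model of mBERT's size.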