Good Meta-tasks Make A Better Cross-lingual Meta-transfer Learning for Low-resource Languages
Linjuan Wu, Zongyi Guo, Baoliang Cui, Haihong Tang, Weiming Lu
Abstract
Model-agnostic meta-learning has garnered attention as a promising technique for enhancing few-shot cross-lingual transfer learning in low-resource scenarios. However, little attention has been paid to the impact of data selection strategies on this cross-lingual meta-transfer method, particularly the sampling of cross-lingual meta-training data (i.e., meta-tasks) at the syntactic level to reduce language gaps. In this paper, we propose a Meta-Task Collector-based Cross-lingual Meta-Transfer framework (MeTaCo-XMT) that adapts different data selection strategies to construct meta-tasks for meta-transfer learning. Since syntactic differences affect transfer performance, we adopt a syntactic similarity sampling strategy and propose a syntactic distance metric model consisting of a syntactic encoder block based on a pre-trained model and a distance metric block using Word Mover's Distance (WMD). Additionally, we conduct experiments with three different data selection strategies to instantiate our framework and analyze their impact on performance. Experimental results on two multilingual NLP datasets, Wikiann and TydiQA, demonstrate that our approach significantly outperforms strong existing baselines.
- Anthology ID: 2023.findings-emnlp.498
- Volume: Findings of the Association for Computational Linguistics: EMNLP 2023
- Month: December
- Year: 2023
- Address: Singapore
- Editors: Houda Bouamor, Juan Pino, Kalika Bali
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 7431–7446
- URL: https://aclanthology.org/2023.findings-emnlp.498
- DOI: 10.18653/v1/2023.findings-emnlp.498
- Cite (ACL): Linjuan Wu, Zongyi Guo, Baoliang Cui, Haihong Tang, and Weiming Lu. 2023. Good Meta-tasks Make A Better Cross-lingual Meta-transfer Learning for Low-resource Languages. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 7431–7446, Singapore. Association for Computational Linguistics.
- Cite (Informal): Good Meta-tasks Make A Better Cross-lingual Meta-transfer Learning for Low-resource Languages (Wu et al., Findings 2023)
- PDF: https://preview.aclanthology.org/naacl24-info/2023.findings-emnlp.498.pdf
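The distance metric block described in the abstract scores sentence pairs with Word Mover's Distance over encoder token embeddings. The sketch below is not the authors' implementation: it uses the *relaxed* WMD lower bound (nearest-neighbour transport instead of the exact optimal-transport solution) and random vectors as stand-ins for the syntactic encoder's output; the function name `relaxed_wmd` is a hypothetical choice.

```python
import numpy as np

def relaxed_wmd(emb_a: np.ndarray, emb_b: np.ndarray) -> float:
    """Relaxed Word Mover's Distance between two token-embedding matrices.

    Each token in one sentence sends all of its mass to the nearest token
    in the other sentence; taking the max of the two directions gives a
    symmetric lower bound on the exact (optimal-transport) WMD.
    """
    # Pairwise Euclidean distances: shape (len_a, len_b).
    dists = np.linalg.norm(emb_a[:, None, :] - emb_b[None, :, :], axis=-1)
    # Relax the transport constraint in each direction (uniform token weights).
    cost_ab = dists.min(axis=1).mean()  # every token of A -> nearest in B
    cost_ba = dists.min(axis=0).mean()  # every token of B -> nearest in A
    return float(max(cost_ab, cost_ba))

# Hypothetical usage: in the paper these embeddings would come from the
# syntactic encoder block; here they are random placeholders.
rng = np.random.default_rng(0)
sent_a = rng.normal(size=(5, 8))  # 5 tokens, 8-dim embeddings
sent_b = rng.normal(size=(7, 8))
print(relaxed_wmd(sent_a, sent_b))  # small distance = syntactically similar
```

Under this metric, a meta-task collector could rank candidate source-language examples by their distance to target-language pivots and sample the closest ones when building meta-tasks.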