Mukul Kumar
2020
Learn to Cross-Lingual Transfer with Meta Graph Learning Across Heterogeneous Languages
Zheng Li | Mukul Kumar | William Headden | Bing Yin | Ying Wei | Yu Zhang | Qiang Yang
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
The recent emergence of multilingual pre-trained language models (mPLMs) has enabled breakthroughs on various downstream cross-lingual transfer (CLT) tasks. However, mPLM-based methods usually suffer from two problems: (1) simple fine-tuning may not adapt general-purpose multilingual representations to be task-aware for low-resource languages; (2) they ignore how cross-lingual adaptation happens for downstream tasks. To address these issues, we propose a meta graph learning (MGL) method. Unlike prior works that transfer from scratch, MGL can learn to transfer cross-lingually by extracting meta-knowledge from historical CLT experiences (tasks), making the mPLM insensitive to low-resource languages. Besides, for each CLT task, MGL formulates its transfer process as information propagation over a dynamic graph, whose geometric structure automatically captures intrinsic language relationships to explicitly guide cross-lingual transfer. Empirically, extensive experiments on both public and real-world datasets demonstrate the effectiveness of the MGL method.
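As a rough illustration of the graph-propagation idea mentioned in the abstract, the sketch below builds a soft, data-dependent adjacency from node embeddings (e.g., pooled mPLM representations of source- and target-language examples) and propagates information over it. All names, dimensions, and design choices here are illustrative assumptions, not the authors' actual MGL implementation.

```python
# Hypothetical sketch: information propagation over a dynamic graph.
# The adjacency is induced from pairwise feature similarity rather than fixed.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DynamicGraphLayer(nn.Module):
    """Propagates node representations over a graph whose edges are
    computed dynamically from the node features themselves."""

    def __init__(self, dim: int):
        super().__init__()
        self.query = nn.Linear(dim, dim)
        self.key = nn.Linear(dim, dim)
        self.update = nn.Linear(dim, dim)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (num_nodes, dim), e.g., mPLM embeddings per language/example.
        # Soft adjacency from scaled pairwise similarity.
        scores = self.query(h) @ self.key(h).T / h.size(-1) ** 0.5
        adj = F.softmax(scores, dim=-1)          # dynamic, data-dependent graph
        # Propagate along induced edges, then update; residual keeps mPLM signal.
        return F.relu(self.update(adj @ h)) + h


if __name__ == "__main__":
    # Toy usage: 6 nodes with 768-dim embeddings (typical mPLM hidden size).
    embeddings = torch.randn(6, 768)
    layer = DynamicGraphLayer(768)
    print(layer(embeddings).shape)  # torch.Size([6, 768])
```

In a meta-learning setup of the kind the abstract describes, a layer like this would be trained episodically across many historical CLT tasks, so that the induced graph structure (rather than only the mPLM weights) carries transferable knowledge to new low-resource languages.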