Knowledge-Aware Meta-learning for Low-Resource Text Classification

Huaxiu Yao, Ying-xin Wu, Maruan Al-Shedivat, Eric Xing


Abstract
Meta-learning has achieved great success in leveraging knowledge learned from historical tasks to facilitate the learning of new tasks. However, the knowledge learned from historical tasks alone, as used by current meta-learning algorithms, may not generalize well to a test task that is not well supported by the training tasks. This paper studies a low-resource text classification problem and bridges the gap between meta-training and meta-testing tasks by leveraging external knowledge bases. Specifically, we propose KGML, which introduces an additional representation for each sentence, learned from an extracted sentence-specific knowledge graph. Extensive experiments on three datasets demonstrate the effectiveness of KGML under both supervised and unsupervised adaptation settings.
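The abstract's core idea can be illustrated with a minimal sketch: augment each sentence's embedding with a representation pooled from its sentence-specific knowledge graph, then classify few-shot queries by nearest prototype. This is an illustrative assumption, not the paper's actual architecture; the `fuse` and `prototype_classify` helpers, and the use of mean pooling and Euclidean prototypes, are hypothetical simplifications.

```python
import numpy as np

def fuse(sentence_emb, entity_embs):
    """Fuse a sentence embedding with its knowledge-graph representation.

    entity_embs holds embeddings of entities extracted for this sentence
    (shape: [num_entities, dim]); mean pooling is a hypothetical stand-in
    for the paper's learned KG representation.
    """
    if len(entity_embs) == 0:
        kg_emb = np.zeros_like(sentence_emb)  # no entities found in the KG
    else:
        kg_emb = entity_embs.mean(axis=0)
    return np.concatenate([sentence_emb, kg_emb])

def prototype_classify(query, support, support_labels):
    """Nearest-prototype classification over fused representations.

    A common few-shot baseline: each class prototype is the mean of its
    support examples, and the query gets the label of the closest prototype.
    """
    classes = sorted(set(support_labels))
    protos = np.stack([
        support[[i for i, y in enumerate(support_labels) if y == c]].mean(axis=0)
        for c in classes
    ])
    dists = np.linalg.norm(protos - query, axis=1)
    return classes[int(np.argmin(dists))]
```

In a low-resource setting, the support set would contain only a handful of labeled sentences per class, and the KG-derived half of each fused vector supplies signal that the few labels alone cannot.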
Anthology ID:
2021.emnlp-main.136
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1814–1821
URL:
https://aclanthology.org/2021.emnlp-main.136
DOI:
10.18653/v1/2021.emnlp-main.136
Cite (ACL):
Huaxiu Yao, Ying-xin Wu, Maruan Al-Shedivat, and Eric Xing. 2021. Knowledge-Aware Meta-learning for Low-Resource Text Classification. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 1814–1821, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Knowledge-Aware Meta-learning for Low-Resource Text Classification (Yao et al., EMNLP 2021)
PDF:
https://preview.aclanthology.org/auto-file-uploads/2021.emnlp-main.136.pdf
Video:
https://preview.aclanthology.org/auto-file-uploads/2021.emnlp-main.136.mp4
Code:
huaxiuyao/KGML