Abstract
Low-shot relation extraction (RE) aims to recognize novel relations with very few or even no training samples, which is critical for real-world applications. Few-shot and zero-shot RE are two representative low-shot RE tasks; they appear to share a similar goal but require quite different underlying abilities. In this paper, we propose Multi-Choice Matching Networks to unify low-shot relation extraction. To bridge the gap between zero-shot and few-shot RE, we propose triplet-paraphrase meta-training, which leverages triplet paraphrasing to pre-train the zero-shot label matching ability and uses a meta-learning paradigm to learn the few-shot instance summarizing ability. Experimental results on three different low-shot RE tasks show that the proposed method outperforms strong baselines by a large margin and achieves the best performance on the few-shot RE leaderboard.
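As a rough illustration of the matching idea the abstract describes, the sketch below encodes a query sentence and a set of candidate relation descriptions with a shared encoder and scores each candidate by dot-product similarity. This is a minimal sketch, not the authors' implementation (see the fc-liu/mcmn repository for the actual MCMN code); all names here (`SimpleMatcher`, `hidden_dim`, the toy encoder) are hypothetical.

```python
# Minimal sketch of multi-choice matching for relation extraction.
# Hypothetical names throughout; the paper's actual model (MCMN) differs
# in encoder architecture and in the triplet-paraphrase meta-training.
import torch
import torch.nn as nn


class SimpleMatcher(nn.Module):
    def __init__(self, vocab_size: int, hidden_dim: int = 128):
        super().__init__()
        # A single shared encoder embeds both the query instance and
        # the candidate relation descriptions ("choices").
        self.embed = nn.EmbeddingBag(vocab_size, hidden_dim)  # mean-pools tokens

    def encode(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len) -> (batch, hidden_dim)
        return self.embed(token_ids)

    def forward(self, query_ids: torch.Tensor, choice_ids: torch.Tensor) -> torch.Tensor:
        # query_ids:  (1, seq_len)             -- the sentence to classify
        # choice_ids: (num_relations, seq_len) -- one description per candidate relation
        query = self.encode(query_ids)      # (1, hidden_dim)
        choices = self.encode(choice_ids)   # (num_relations, hidden_dim)
        # Score every candidate relation against the query; argmax = prediction.
        return query @ choices.t()          # (1, num_relations)


if __name__ == "__main__":
    model = SimpleMatcher(vocab_size=1000)
    query = torch.randint(0, 1000, (1, 16))
    choices = torch.randint(0, 1000, (5, 16))  # 5 candidate relation descriptions
    scores = model(query, choices)
    print(scores.argmax(dim=1))  # index of the best-matching relation
```

Because the candidate relations enter only as encoded text, the same matching head handles both settings: zero-shot RE matches against unseen label descriptions, while few-shot RE matches against summaries of the support instances.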
- Anthology ID: 2022.acl-long.397
- Volume: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
- Month: May
- Year: 2022
- Address: Dublin, Ireland
- Venue: ACL
- Publisher: Association for Computational Linguistics
- Pages: 5785–5795
- URL: https://aclanthology.org/2022.acl-long.397
- DOI: 10.18653/v1/2022.acl-long.397
- Cite (ACL): Fangchao Liu, Hongyu Lin, Xianpei Han, Boxi Cao, and Le Sun. 2022. Pre-training to Match for Unified Low-shot Relation Extraction. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 5785–5795, Dublin, Ireland. Association for Computational Linguistics.
- Cite (Informal): Pre-training to Match for Unified Low-shot Relation Extraction (Liu et al., ACL 2022)
- PDF: https://preview.aclanthology.org/ingestion-script-update/2022.acl-long.397.pdf
- Code: fc-liu/mcmn
- Data: FewRel