GRADUAL: Granularity-aware Dual Prototype Learning for Better Few-Shot Relation Extraction
Zhiming Li | Yuchen Lyu
Findings of the Association for Computational Linguistics: ACL 2024
Recent studies have shown that fusing text labels and context sentences is an effective method for learning prototype representations in few-shot relation extraction. However, even with text labels integrated into the prototypes, the **inconsistency of prototype representations** across different few-shot tasks persists, because the same relation is expressed by different context sentences in each task. Conversely, the text label for each relation is unique and consistent, which prompts us to propose a **dual prototype learning method**. Unlike previous methods that construct only support-based prototypes, we additionally construct label-based prototypes. Furthermore, we introduce a graph-based prototype adjustment module that builds topological information between support-based and label-based prototypes, yielding a more effective similarity measure through a simple linear combination. In addition, relations of different granularities have different distribution widths in the same semantic space; this **imbalanced distribution in the semantic space** leads to a lack of comparability among relations. To create a more discriminative semantic space, we propose a **granularity-aware prototype learning method** that unifies the distribution widths of relations, so that relations of different granularities have similar distribution widths. Experimental results on two public benchmark datasets show that our proposed methods achieve state-of-the-art performance in few-shot relation classification.
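To make the final scoring step concrete, below is a minimal sketch of how a query could be matched against the two prototype sets and combined through a simple linear combination, as the abstract describes. The function name, tensor shapes, and the mixing weight `alpha` are assumptions for illustration, not the paper's actual implementation; the graph-based prototype adjustment module, which would refine both prototype sets beforehand, is omitted here.

```python
import torch

def dual_prototype_similarity(query, support_proto, label_proto, alpha=0.5):
    """Score queries against relation prototypes (hypothetical sketch).

    query:         [Q, d] query instance embeddings
    support_proto: [N, d] prototypes averaged from support-set sentences
    label_proto:   [N, d] prototypes encoded from the relation text labels
    alpha:         assumed mixing weight of the linear combination
    """
    sim_support = query @ support_proto.t()  # [Q, N] support-based similarity
    sim_label = query @ label_proto.t()      # [Q, N] label-based similarity
    # Final score: a simple linear combination of the two similarity matrices;
    # the predicted relation for each query is the argmax over the N classes.
    return alpha * sim_support + (1.0 - alpha) * sim_label
```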