RGL: A Simple yet Effective Relation Graph Augmented Prompt-based Tuning Approach for Few-Shot Learning

Yaqing Wang, Xin Tian, Haoyi Xiong, Yueyang Li, Zeyu Chen, Sheng Guo, Dejing Dou

Abstract
Pre-trained language models (PLMs) provide a good starting point for downstream applications, but they are difficult to generalize to new tasks when only a few labeled samples are available. In this work, we show that Relation Graph augmented Learning (RGL) improves performance on few-shot natural language understanding tasks. During learning, RGL constructs a relation graph based on label consistency between samples in the same batch, and learns to solve the resulting node classification and link prediction problems on that graph. In this way, RGL fully exploits the limited supervised information, boosting tuning effectiveness. Extensive experimental results show that RGL consistently improves the performance of prompt-based tuning strategies.
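As a rough illustration of the relation-graph objective described above, the sketch below builds the label-consistency adjacency for one batch and scores every candidate edge from the pairwise similarity of sample representations (e.g., the [MASK]-token embeddings produced by the prompted PLM). The function name relation_graph_loss, the temperature tau, and the choice of cosine similarity as edge logits are illustrative assumptions, not the paper's exact implementation.

import torch
import torch.nn.functional as F

def relation_graph_loss(features, labels, tau=0.1):
    # features: (B, d) per-sample representations, e.g. prompt [MASK] embeddings
    # labels:   (B,)   class labels of the batch
    # Relation-graph adjacency: an edge joins two samples with the same label.
    adj = (labels.unsqueeze(0) == labels.unsqueeze(1)).float()
    # Edge logits from temperature-scaled cosine similarity of node features.
    feats = F.normalize(features, dim=-1)
    logits = feats @ feats.t() / tau
    # Exclude self-pairs and treat each remaining edge as a binary prediction.
    off_diag = ~torch.eye(len(labels), dtype=torch.bool, device=labels.device)
    return F.binary_cross_entropy_with_logits(logits[off_diag], adj[off_diag])

In training, such a term would be added to the usual prompt-based classification loss, e.g. loss = prompt_loss + lam * relation_graph_loss(mask_reps, batch_labels), so that the scarce labels supervise both the per-sample predictions and the pairwise relations.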
Anthology ID:
2022.findings-naacl.81
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1078–1084
URL:
https://aclanthology.org/2022.findings-naacl.81
DOI:
10.18653/v1/2022.findings-naacl.81
Cite (ACL):
Yaqing Wang, Xin Tian, Haoyi Xiong, Yueyang Li, Zeyu Chen, Sheng Guo, and Dejing Dou. 2022. RGL: A Simple yet Effective Relation Graph Augmented Prompt-based Tuning Approach for Few-Shot Learning. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 1078–1084, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
RGL: A Simple yet Effective Relation Graph Augmented Prompt-based Tuning Approach for Few-Shot Learning (Wang et al., Findings 2022)
PDF:
https://preview.aclanthology.org/nschneid-patch-4/2022.findings-naacl.81.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-4/2022.findings-naacl.81.mp4
Data:
GLUE