Chenyao Liu
2021
Learning Algebraic Recombination for Compositional Generalization
Chenyao Liu | Shengnan An | Zeqi Lin | Qian Liu | Bei Chen | Jian-Guang Lou | Lijie Wen | Nanning Zheng | Dongmei Zhang
Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021
Semi-supervised Relation Extraction via Incremental Meta Self-Training
Xuming Hu | Chenwei Zhang | Fukun Ma | Chenyao Liu | Lijie Wen | Philip S. Yu
Findings of the Association for Computational Linguistics: EMNLP 2021
To reduce the human effort required for large-scale annotation, semi-supervised relation extraction methods aim to leverage unlabeled data in addition to learning from limited labeled samples. Existing self-training methods suffer from the gradual drift problem, in which noisy pseudo labels on unlabeled data are incorporated during training. To alleviate this noise, we propose MetaSRE, in which a Relation Label Generation Network produces accurate quality assessments of pseudo labels by (meta-)learning from the successful and failed attempts of a Relation Classification Network as an additional meta-objective. To reduce the influence of noisy pseudo labels, MetaSRE adopts a pseudo-label selection and exploitation scheme that assesses pseudo-label quality on unlabeled samples and exploits only high-quality pseudo labels in a self-training fashion, incrementally augmenting the labeled samples for both robustness and accuracy. Experimental results on two public datasets demonstrate the effectiveness of the proposed approach.
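The selection-and-exploitation loop described in the abstract can be sketched as follows. This is a minimal illustration, not the MetaSRE implementation: the real system uses two neural networks (the Relation Classification Network and the meta-learned Relation Label Generation Network), whereas here a toy nearest-centroid classifier stands in for the classifier and a simple inverse-distance confidence stands in for the learned quality assessment. All function names, the `threshold` parameter, and the scoring rule are illustrative assumptions.

```python
import math

def train(labeled):
    # Toy "relation classifier": compute one centroid per label.
    # labeled: list of (feature_vector, label) pairs.
    sums, counts = {}, {}
    for x, y in labeled:
        c = sums.setdefault(y, [0.0] * len(x))
        counts[y] = counts.get(y, 0) + 1
        for i, v in enumerate(x):
            c[i] += v
    return {y: [v / counts[y] for v in c] for y, c in sums.items()}

def predict_with_score(model, x):
    # Inverse-distance confidence stands in for the quality assessment
    # that MetaSRE's Relation Label Generation Network would produce.
    dists = {y: math.dist(c, x) for y, c in model.items()}
    y = min(dists, key=dists.get)
    return y, 1.0 / (1.0 + dists[y])

def meta_self_train(labeled, unlabeled, rounds=3, threshold=0.5):
    # Self-training with pseudo-label selection: each round, only
    # high-quality pseudo labels augment the labeled set, which is
    # the mechanism the paper uses to limit gradual drift.
    labeled = list(labeled)
    for _ in range(rounds):
        model = train(labeled)
        kept, rest = [], []
        for x in unlabeled:
            y, score = predict_with_score(model, x)
            (kept if score >= threshold else rest).append((x, y))
        if not kept:
            break
        labeled += kept                      # exploit high-quality pseudo labels
        unlabeled = [x for x, _ in rest]     # low-quality ones stay unlabeled
    return train(labeled)
```

The key design point mirrored here is that low-confidence pseudo labels are never added to the training set; they remain in the unlabeled pool and may be re-assessed in a later round once the classifier has improved.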