Sam Brody
2022
Enhanced Distant Supervision with State-Change Information for Relation Extraction
Jui Shah | Dongxu Zhang | Sam Brody | Andrew McCallum
Proceedings of the Thirteenth Language Resources and Evaluation Conference
In this work, we introduce a method for enhancing distant supervision with state-change information for relation extraction. We provide a training dataset created via this process, along with manually annotated development and test sets. We present an analysis of the curation process and data, and compare it to standard distant supervision. We demonstrate that the addition of state-change information reduces noise when used for static relation extraction, and can also be used to train a relation-extraction system that detects a change of state in relations.
2021
Towards Realistic Few-Shot Relation Extraction
Sam Brody | Sichao Wu | Adrian Benton
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
In recent years, few-shot models have been applied successfully to a variety of NLP tasks. Han et al. (2018) introduced a few-shot learning framework for relation classification, and since then, several models have surpassed human performance on this task, leading to the impression that few-shot relation classification is solved. In this paper, we take a deeper look at the efficacy of strong few-shot classification models in the more common relation extraction setting, and show that typical few-shot evaluation metrics obscure a wide variability in performance across relations. In particular, we find that state-of-the-art few-shot relation classification models rely too heavily on entity type information, and we propose modifications to the training routine to encourage models to better discriminate between relations involving similar entity types.