Abstract
Neural models for distantly supervised relation extraction (DS-RE) encode each sentence in an entity-pair bag separately, and the sentence encodings are then aggregated for bag-level relation prediction. Because these approaches do not allow information to flow between the sentences of a bag at encoding time, we believe they do not utilize the available bag data to the fullest. In response, we explore a simple baseline approach (PARE) in which all sentences of a bag are concatenated into a passage and encoded jointly using BERT. The contextual token embeddings are aggregated using attention with the candidate relation as query; this summary of the whole passage is used to predict the candidate relation. We find that our simple baseline outperforms existing state-of-the-art DS-RE models on both monolingual and multilingual DS-RE datasets.
- Anthology ID:
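The relation-as-query attention described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, toy dimensions, and inputs are hypothetical, and the contextual token embeddings are assumed to come from a BERT encoding of the concatenated passage.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def pare_summary(token_embeddings, relation_query):
    """Aggregate contextual token embeddings of the concatenated
    passage using attention with the candidate relation as query,
    producing a relation-specific summary of the whole passage."""
    # attention score for each token: dot product with the relation query
    scores = [sum(q * t for q, t in zip(relation_query, tok))
              for tok in token_embeddings]
    weights = softmax(scores)
    dim = len(relation_query)
    # weighted sum of token embeddings -> passage summary vector
    return [sum(w * tok[d] for w, tok in zip(weights, token_embeddings))
            for d in range(dim)]

# toy example: a 3-token passage with 2-dimensional embeddings
tokens = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
query = [1.0, 0.0]
summary = pare_summary(tokens, query)
```

In the full model this summary vector would be scored against the candidate relation (e.g. by a classifier head) to produce the bag-level prediction.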
- 2022.acl-short.38
- Volume:
- Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
- Month:
- May
- Year:
- 2022
- Address:
- Dublin, Ireland
- Editors:
- Smaranda Muresan, Preslav Nakov, Aline Villavicencio
- Venue:
- ACL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 340–354
- URL:
- https://aclanthology.org/2022.acl-short.38
- DOI:
- 10.18653/v1/2022.acl-short.38
- Cite (ACL):
- Vipul Rathore, Kartikeya Badola, Parag Singla, and Mausam. 2022. PARE: A Simple and Strong Baseline for Monolingual and Multilingual Distantly Supervised Relation Extraction. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 340–354, Dublin, Ireland. Association for Computational Linguistics.
- Cite (Informal):
- PARE: A Simple and Strong Baseline for Monolingual and Multilingual Distantly Supervised Relation Extraction (Rathore et al., ACL 2022)
- PDF:
- https://preview.aclanthology.org/corrections-2024-07/2022.acl-short.38.pdf
- Code
- dair-iitd/dsre
- Data
- DiS-ReX