Pretrained Knowledge Base Embeddings for improved Sentential Relation Extraction
Andrea Papaluca, Daniel Krefl, Hanna Suominen, Artem Lenskiy
Abstract
In this work, we propose combining pretrained knowledge base graph embeddings with transformer-based language models to improve performance on the sentential Relation Extraction task in natural language processing. Our proposed model is a simple variation of existing models that incorporates off-task pretrained graph embeddings alongside an on-task finetuned BERT encoder. We perform a detailed statistical evaluation of the model on standard datasets. We provide evidence that the added graph embeddings improve performance, making this simple approach competitive with state-of-the-art models that perform explicit on-task training of the graph embeddings. Furthermore, for the underlying BERT model we observe an interesting power-law scaling behavior between the variance of the F1 score obtained for a relation class and its support in terms of training examples.
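Below is a minimal PyTorch sketch of the kind of architecture the abstract describes: BERT representations of the two entity mentions are concatenated with their frozen, off-task pretrained knowledge base embeddings and fed to a linear relation classifier. This is not the authors' implementation (see the Code link at the bottom of this page); the class name, the first-subtoken entity pooling, and the shape of the embedding table are illustrative assumptions.

```python
import torch
import torch.nn as nn
from transformers import BertModel

class KBEnhancedRelationClassifier(nn.Module):
    """Hypothetical sketch: fuse BERT entity representations with frozen,
    off-task pretrained KB graph embeddings for relation classification."""

    def __init__(self, kb_embeddings: torch.Tensor, num_relations: int):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-cased")
        # Pretrained graph embeddings (e.g., a TransE-style table over the KB),
        # kept frozen during on-task finetuning of the BERT encoder.
        self.kb = nn.Embedding.from_pretrained(kb_embeddings, freeze=True)
        hidden = self.bert.config.hidden_size  # 768 for bert-base
        kb_dim = kb_embeddings.size(1)
        self.classifier = nn.Linear(2 * (hidden + kb_dim), num_relations)

    def forward(self, input_ids, attention_mask,
                head_pos, tail_pos, head_kb_id, tail_kb_id):
        tokens = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        rows = torch.arange(input_ids.size(0))
        head = tokens[rows, head_pos]  # first sub-token of the head entity
        tail = tokens[rows, tail_pos]  # first sub-token of the tail entity
        # Concatenate the contextual and graph views of both entities.
        pair = torch.cat([head, self.kb(head_kb_id),
                          tail, self.kb(tail_kb_id)], dim=-1)
        return self.classifier(pair)
```

Freezing the embedding table reflects the paper's use of off-task pretrained graph embeddings, in contrast to the state-of-the-art models mentioned above, which train their graph embeddings on-task.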
- Anthology ID:
- 2022.acl-srw.29
- Volume:
- Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop
- Month:
- May
- Year:
- 2022
- Address:
- Dublin, Ireland
- Editors:
- Samuel Louvan, Andrea Madotto, Brielen Madureira
- Venue:
- ACL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 373–382
- URL:
- https://aclanthology.org/2022.acl-srw.29
- DOI:
- 10.18653/v1/2022.acl-srw.29
- Cite (ACL):
- Andrea Papaluca, Daniel Krefl, Hanna Suominen, and Artem Lenskiy. 2022. Pretrained Knowledge Base Embeddings for improved Sentential Relation Extraction. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop, pages 373–382, Dublin, Ireland. Association for Computational Linguistics.
- Cite (Informal):
- Pretrained Knowledge Base Embeddings for improved Sentential Relation Extraction (Papaluca et al., ACL 2022)
- PDF:
- https://aclanthology.org/2022.acl-srw.29.pdf
- Code:
- https://github.com/brunoliegibastonliegi/pretrained-kb-embeddings-for-re