Abstract
As research on utilizing human knowledge in natural language processing has attracted considerable attention in recent years, knowledge graph (KG) completion has come into the spotlight. Recently, a KG completion method based on a pre-trained language model, KG-BERT, was proposed and showed strong performance. However, its scores on ranking metrics such as Hits@k still lag behind those of state-of-the-art models. We claim that there are two main reasons: 1) it fails to sufficiently learn the relational information in knowledge graphs, and 2) it struggles to pick out the correct answer from lexically similar candidates. In this paper, we propose an effective multi-task learning method to overcome these limitations. By combining the relation prediction and relevance ranking tasks with the target link prediction task, the proposed model can learn more relational properties of KGs and perform properly even when candidates are lexically similar. Experimental results show that our model not only substantially improves ranking performance over KG-BERT but also achieves state-of-the-art performance in Mean Rank and Hits@10 on the WN18RR dataset.
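The multi-task setup described in the abstract can be pictured as one shared BERT encoder feeding three lightweight task heads whose losses are combined during training. The sketch below is a minimal illustration under assumed PyTorch/Hugging Face conventions; the head layers, batch layout, and equal loss weighting are our assumptions, not the paper's exact design (the authors' code is at bosung/mtl-kgc).

```python
# Minimal sketch of the multi-task idea: one shared BERT encoder with three
# task heads (link prediction, relation prediction, relevance ranking).
# Head names and the summed, equally weighted loss are illustrative
# assumptions; see bosung/mtl-kgc for the authors' actual implementation.
import torch
import torch.nn as nn
from transformers import BertModel

class MultiTaskKGC(nn.Module):
    def __init__(self, num_relations: int, model_name: str = "bert-base-uncased"):
        super().__init__()
        self.encoder = BertModel.from_pretrained(model_name)  # shared encoder
        hidden = self.encoder.config.hidden_size
        self.lp_head = nn.Linear(hidden, 2)              # link prediction: is the triple plausible?
        self.rp_head = nn.Linear(hidden, num_relations)  # relation prediction: which relation links h and t?
        self.rr_head = nn.Linear(hidden, 1)              # relevance ranking: scalar score for margin ranking

    def encode(self, input_ids, attention_mask):
        # Use the [CLS] representation of the serialized triple text.
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        return out.last_hidden_state[:, 0]

    def forward(self, input_ids, attention_mask, task: str):
        cls = self.encode(input_ids, attention_mask)
        if task == "link_prediction":
            return self.lp_head(cls)
        if task == "relation_prediction":
            return self.rp_head(cls)
        return self.rr_head(cls).squeeze(-1)  # relevance score

# Per batch, the auxiliary losses are combined with the main link-prediction
# loss; equal weighting is an assumption, not the paper's reported setting.
def multitask_loss(model, lp_batch, rp_batch, rr_pos, rr_neg, margin: float = 1.0):
    ce = nn.CrossEntropyLoss()
    lp_loss = ce(model(*lp_batch[:2], task="link_prediction"), lp_batch[2])
    rp_loss = ce(model(*rp_batch[:2], task="relation_prediction"), rp_batch[2])
    pos = model(*rr_pos, task="relevance_ranking")
    neg = model(*rr_neg, task="relevance_ranking")
    rr_loss = nn.functional.relu(margin - pos + neg).mean()  # margin ranking loss
    return lp_loss + rp_loss + rr_loss
```

Sharing the encoder across the three tasks is what lets the auxiliary objectives inject relational signal and ranking sensitivity into the representations used for link prediction.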
- Anthology ID: 2020.coling-main.153
- Volume: Proceedings of the 28th International Conference on Computational Linguistics
- Month: December
- Year: 2020
- Address: Barcelona, Spain (Online)
- Editors: Donia Scott, Nuria Bel, Chengqing Zong
- Venue: COLING
- Publisher: International Committee on Computational Linguistics
- Pages: 1737–1743
- URL: https://aclanthology.org/2020.coling-main.153
- DOI: 10.18653/v1/2020.coling-main.153
- Cite (ACL): Bosung Kim, Taesuk Hong, Youngjoong Ko, and Jungyun Seo. 2020. Multi-Task Learning for Knowledge Graph Completion with Pre-trained Language Models. In Proceedings of the 28th International Conference on Computational Linguistics, pages 1737–1743, Barcelona, Spain (Online). International Committee on Computational Linguistics.
- Cite (Informal): Multi-Task Learning for Knowledge Graph Completion with Pre-trained Language Models (Kim et al., COLING 2020)
- PDF: https://aclanthology.org/2020.coling-main.153.pdf
- Code: bosung/mtl-kgc
- Data: FB15k, FB15k-237, WN18, WN18RR