HittER: Hierarchical Transformers for Knowledge Graph Embeddings
Sanxing Chen, Xiaodong Liu, Jianfeng Gao, Jian Jiao, Ruofei Zhang, Yangfeng Ji
Abstract
This paper examines the challenging problem of learning representations of entities and relations in a complex multi-relational knowledge graph. We propose HittER, a Hierarchical Transformer model to jointly learn Entity-relation composition and Relational contextualization based on a source entity’s neighborhood. Our proposed model consists of two different Transformer blocks: the bottom block extracts features of each entity-relation pair in the local neighborhood of the source entity and the top block aggregates the relational information from outputs of the bottom block. We further design a masked entity prediction task to balance information from the relational context and the source entity itself. Experimental results show that HittER achieves new state-of-the-art results on multiple link prediction datasets. We additionally propose a simple approach to integrate HittER into BERT and demonstrate its effectiveness on two Freebase factoid question answering datasets.
- Anthology ID: 2021.emnlp-main.812
- Volume: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
- Month: November
- Year: 2021
- Address: Online and Punta Cana, Dominican Republic
- Editors: Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
- Venue: EMNLP
- Publisher: Association for Computational Linguistics
- Pages: 10395–10407
- URL: https://aclanthology.org/2021.emnlp-main.812
- DOI: 10.18653/v1/2021.emnlp-main.812
- Cite (ACL): Sanxing Chen, Xiaodong Liu, Jianfeng Gao, Jian Jiao, Ruofei Zhang, and Yangfeng Ji. 2021. HittER: Hierarchical Transformers for Knowledge Graph Embeddings. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 10395–10407, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
- Cite (Informal): HittER: Hierarchical Transformers for Knowledge Graph Embeddings (Chen et al., EMNLP 2021)
- PDF: https://preview.aclanthology.org/nschneid-patch-2/2021.emnlp-main.812.pdf
- Code: additional community code
- Data: FB15k, FB15k-237, WN18, WN18RR
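The abstract describes a two-block hierarchy: a bottom Transformer that encodes each entity-relation pair in the source entity's neighborhood, and a top Transformer that aggregates those pair features. The following is a minimal PyTorch sketch of that structure, not the authors' released implementation: the module names, dimensions, [CLS]-style summary tokens, and the linear readout over candidate entities are illustrative assumptions, and the paper's masked entity prediction objective and BERT integration are not shown.

```python
# Minimal sketch of a hierarchical two-block Transformer for link prediction,
# loosely following the description in the abstract. Not the official code;
# all names, sizes, and the readout layer are illustrative assumptions.
import torch
import torch.nn as nn


def make_encoder(dim: int, heads: int, layers: int) -> nn.TransformerEncoder:
    layer = nn.TransformerEncoderLayer(dim, heads, dim * 4, batch_first=True)
    return nn.TransformerEncoder(layer, num_layers=layers)


class HittERSketch(nn.Module):
    def __init__(self, num_entities: int, num_relations: int,
                 dim: int = 128, heads: int = 4, layers: int = 2):
        super().__init__()
        self.ent_emb = nn.Embedding(num_entities, dim)
        self.rel_emb = nn.Embedding(num_relations, dim)
        # Learned [CLS]-style summary tokens for each block (assumed design).
        self.bottom_cls = nn.Parameter(torch.randn(dim))
        self.top_cls = nn.Parameter(torch.randn(dim))
        # Bottom block: features of each (entity, relation) pair.
        self.bottom = make_encoder(dim, heads, layers)
        # Top block: aggregates pair features over the source's neighborhood.
        self.top = make_encoder(dim, heads, layers)
        # Simplified readout that scores every entity as the missing target.
        self.readout = nn.Linear(dim, num_entities)

    def forward(self, src_entity, query_relation, nbr_entities, nbr_relations):
        # src_entity:      (B,)   source entity ids
        # query_relation:  (B,)   query relation ids
        # nbr_entities:    (B, N) neighbor entity ids of the source
        # nbr_relations:   (B, N) relations linking the neighbors to the source
        B, N = nbr_entities.shape
        dim = self.ent_emb.embedding_dim

        # Treat (source, query relation) as one more pair alongside the neighborhood.
        ents = torch.cat([src_entity.unsqueeze(1), nbr_entities], dim=1)       # (B, N+1)
        rels = torch.cat([query_relation.unsqueeze(1), nbr_relations], dim=1)  # (B, N+1)

        # Bottom block input per pair: [CLS, entity, relation] tokens.
        tokens = torch.stack(
            [self.bottom_cls.expand(B, N + 1, dim),
             self.ent_emb(ents),
             self.rel_emb(rels)],
            dim=2,
        ).reshape(B * (N + 1), 3, dim)
        pair_feat = self.bottom(tokens)[:, 0].reshape(B, N + 1, dim)

        # Top block aggregates pair features through a global [CLS] token.
        top_in = torch.cat([self.top_cls.expand(B, 1, dim), pair_feat], dim=1)
        context = self.top(top_in)[:, 0]        # (B, dim)

        return self.readout(context)            # (B, num_entities)
```

A training loop under these assumptions would apply a cross-entropy loss between the returned scores and the held-out target entity for each (source, relation) query; the masked entity prediction task mentioned in the abstract additionally hides the source entity's own information so the model must balance it against the relational context.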