2020
Knowledge Graph Embedding with Atrous Convolution and Residual Learning
Feiliang Ren | Juchen Li | Huihui Zhang | Shilei Liu | Bochao Li | Ruicheng Ming | Yujia Bai
Proceedings of the 28th International Conference on Computational Linguistics
Knowledge graph embedding is an important task that benefits many downstream applications. Currently, deep neural network based methods achieve state-of-the-art performance. However, most of these existing methods are very complex and need much time for training and inference. To address this issue, we propose a simple but effective atrous convolution based knowledge graph embedding method. Compared with existing state-of-the-art methods, our method has the following main characteristics. First, it effectively increases feature interactions by using atrous convolutions. Second, to address the original information forgetting issue and the vanishing/exploding gradient issue, it uses residual learning. Third, it has a simpler structure but much higher parameter efficiency. We evaluate our method on six benchmark datasets with different evaluation metrics. Extensive experiments show that our model is very effective. On these diverse datasets, it achieves better results than the compared state-of-the-art methods on most of the evaluation metrics. The source code of our model can be found at https://github.com/neukg/AcrE.
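As a rough illustration of the idea described in the abstract, the sketch below combines dilated (atrous) 2D convolutions with a residual connection over stacked head/relation embeddings and scores every candidate tail entity. The class name, embedding dimension, channel counts, dilation rates, and reshape layout are assumptions made for illustration only; they are not the paper's exact AcrE architecture, which is available in the linked repository.

```python
import torch
import torch.nn as nn

class AtrousResidualKGE(nn.Module):
    """Illustrative sketch (not the authors' AcrE model): dilated (atrous)
    convolutions plus a residual connection over stacked (head, relation)
    embeddings, followed by scoring against all entity embeddings."""

    def __init__(self, num_entities, num_relations, dim=200, channels=32):
        super().__init__()
        self.ent = nn.Embedding(num_entities, dim)
        self.rel = nn.Embedding(num_relations, dim)
        self.h, self.w = 10, dim // 10            # reshape each embedding into a 2D map
        # one standard convolution followed by atrous convolutions with growing dilation,
        # which enlarges the receptive field and increases feature interactions
        self.conv1 = nn.Conv2d(1, channels, 3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=2, dilation=2)
        self.conv3 = nn.Conv2d(channels, channels, 3, padding=3, dilation=3)
        self.proj_in = nn.Conv2d(1, channels, 1)  # 1x1 conv so the residual add matches channels
        self.fc = nn.Linear(channels * 2 * self.h * self.w, dim)
        self.act = nn.ReLU()

    def forward(self, head_idx, rel_idx):
        h = self.ent(head_idx).view(-1, 1, self.h, self.w)
        r = self.rel(rel_idx).view(-1, 1, self.h, self.w)
        x = torch.cat([h, r], dim=2)              # stack along height: (B, 1, 2h, w)
        out = self.act(self.conv1(x))
        out = self.act(self.conv2(out))
        out = self.conv3(out)
        out = self.act(out + self.proj_in(x))     # residual link back to the original input
        out = self.fc(out.flatten(1))             # project back to embedding space
        return out @ self.ent.weight.t()          # scores for every candidate tail entity
```

Under these assumptions, `AtrousResidualKGE(n_ent, n_rel)(heads, rels)` returns a `(batch, n_ent)` score matrix that can be trained with a standard binary cross-entropy objective against the true tail entities.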