
Knowledge graph embeddings based on 2D convolution and self-attention mechanisms for link prediction

Published: 09 December 2024

Abstract

Link prediction uses existing facts in a knowledge graph to infer missing ones, improving the completeness of the graph and supporting a range of downstream applications. However, existing link prediction models usually extract features in only a global or a local scope, so feature interactions outside that single scope are missed. In addition, many models must increase their embedding dimensions and parameter counts to reach their best results, which causes scalability problems on large knowledge graphs. To address these issues, we propose a model that fuses a multi-head self-attention mechanism with 2D convolution for the link prediction task. The multi-head self-attention mechanism captures feature interactions between entities and relations in the global scope, while the 2D convolution we introduce captures feature interactions in the local scope. On the standard link prediction benchmarks FB15k-237 and WN18RR, the fused model is competitive with current state-of-the-art models. In particular, it improves MRR by 13.7% and 14.7% over ConvE (which uses only 2D convolution), and by 2.5% and 0.5% over SAttLE (which uses only the self-attention mechanism). Moreover, because entities and relations are embedded in low dimensions, the model has low complexity and good scalability, and can therefore handle link prediction on larger real-world knowledge graphs.
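
To make the fusion idea concrete, the following PyTorch sketch scores (head, relation, ?) queries by combining a multi-head self-attention branch over the head and relation embeddings (global interactions) with a ConvE-style 2D-convolution branch over their stacked 2D reshapes (local interactions). The class name, layer sizes, reshaping scheme, and the concatenation-based fusion are illustrative assumptions, not the authors' exact architecture.

```python
# Minimal sketch of the attention + 2D-convolution fusion described in the abstract.
# All hyperparameters and the fusion scheme are assumptions for illustration only.
import torch
import torch.nn as nn


class AttnConvScorer(nn.Module):
    def __init__(self, n_entities, n_relations, emb_dim=100, n_heads=4, channels=32):
        super().__init__()
        assert emb_dim % 2 == 0, "emb_dim must be even to reshape into a 2D map"
        self.emb_dim = emb_dim
        self.ent = nn.Embedding(n_entities, emb_dim)   # low-dimensional entity embeddings
        self.rel = nn.Embedding(n_relations, emb_dim)  # low-dimensional relation embeddings

        # Global branch: multi-head self-attention over the two-token sequence [head, relation].
        self.attn = nn.MultiheadAttention(emb_dim, n_heads, batch_first=True)

        # Local branch: 2D convolution over the stacked, 2D-reshaped embeddings (ConvE-style).
        self.conv = nn.Conv2d(1, channels, kernel_size=3, padding=1)
        conv_feat = channels * 4 * (emb_dim // 2)  # padding=1 preserves the (4, emb_dim/2) map

        # Fusion: project the concatenated global and local features back to emb_dim.
        self.fuse = nn.Linear(2 * emb_dim + conv_feat, emb_dim)

    def forward(self, head_idx, rel_idx):
        h = self.ent(head_idx)                          # (B, d)
        r = self.rel(rel_idx)                           # (B, d)

        # Global feature interactions via self-attention.
        seq = torch.stack([h, r], dim=1)                # (B, 2, d)
        attn_out, _ = self.attn(seq, seq, seq)
        global_feat = attn_out.reshape(attn_out.size(0), -1)        # (B, 2d)

        # Local feature interactions via 2D convolution on the stacked reshaped embeddings.
        img = torch.cat([h.view(-1, 2, self.emb_dim // 2),
                         r.view(-1, 2, self.emb_dim // 2)], dim=1)  # (B, 4, d/2)
        local_feat = torch.relu(self.conv(img.unsqueeze(1)))
        local_feat = local_feat.reshape(local_feat.size(0), -1)     # (B, conv_feat)

        # Fuse both scopes and score every candidate tail entity (1-N scoring).
        query = self.fuse(torch.cat([global_feat, local_feat], dim=1))  # (B, d)
        return query @ self.ent.weight.t()                              # (B, n_entities)


# Hypothetical usage with FB15k-237-sized counts (14,541 entities, 237 relations):
# model = AttnConvScorer(14541, 237)
# logits = model(torch.tensor([0, 1]), torch.tensor([5, 7]))  # shape (2, 14541)
```

In this kind of setup the 1-N scoring head would typically be trained with a binary cross-entropy objective; the R-Drop regularization listed in the author tags is omitted here for brevity.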

References

[1] Zhong L, Wu J, Li Q, Peng H, Wu X (2023) A comprehensive survey on automatic knowledge graph construction. ACM Comput Surv 56(4):1–62
[2] Zhao X, Chen H, Xing Z, Miao C (2023) Brain-inspired search engine assistant based on knowledge graph. IEEE Transactions on Neural Networks and Learning Systems 34(8):4386–4400
[3] Liu J, Schmid F, Li K, Zheng W (2021) A knowledge graph-based approach for exploring railway operational accidents. Reliability Engineering & System Safety 207:107352
[4] Chen X, Jia S, Xiang Y (2020) A review: Knowledge reasoning over knowledge graph. Expert Syst Appl 141:112948
[5] Bollacker K, Evans C, Paritosh P, Sturge T, Taylor J (2008) Freebase: a collaboratively created graph database for structuring human knowledge. In: Proceedings of the 2008 ACM SIGMOD International Conference on Management of Data
[6] Vrandečić D, Krötzsch M (2014) Wikidata: a free collaborative knowledgebase. Commun ACM 57(10):78–85
[7] Wang Q, Ji Y, Hao Y, Cao J (2020) GRL: Knowledge graph completion with GAN-based reinforcement learning. Knowl-Based Syst 209:106421
[8] Shen T, Zhang F, Cheng J (2022) A comprehensive overview of knowledge graph completion. Knowl-Based Syst 255:109597
[9] Lu H, Hu H, Lin X (2022) DensE: An enhanced non-commutative representation for knowledge graph embedding with adaptive semantic hierarchy. Neurocomputing 476:115–125
[10] Sha X, Sun Z, Zhang J (2021) Hierarchical attentive knowledge graph embedding for personalized recommendation. Electron Commer Res Appl 48:101071
[11] Ji S, Pan S, Cambria E, Marttinen P, Yu PS (2022) A survey on knowledge graphs: Representation, acquisition, and applications. IEEE Transactions on Neural Networks and Learning Systems 33(2):494–514
[12] Bordes A, Usunier N, Garcia-Duran A, Weston J, Yakhnenko O (2013) Translating embeddings for modeling multi-relational data. Advances in Neural Information Processing Systems 26
[13] Sun Z, Deng Z-H, Nie J-Y, Tang J (2019) RotatE: Knowledge graph embedding by relational rotation in complex space. In: International Conference on Learning Representations
[14] Balazevic I, Allen C, Hospedales T (2019) TuckER: Tensor factorization for knowledge graph completion. In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
[15] Dettmers T, Minervini P, Stenetorp P, Riedel S (2018) Convolutional 2D knowledge graph embeddings. Proceedings of the AAAI Conference on Artificial Intelligence 32(1)
[16] Jiang X, Wang Q, Wang B (2019) Adaptive convolution for multi-relational learning. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics (NAACL)
[17] Zhou J, Cui G, Hu S, Zhang Z, Yang C, Liu Z, Wang L, Li C, Sun M (2020) Graph neural networks: A review of methods and applications. AI Open 1:57–81
[18] Vashishth S, Sanyal S, Nitin V, Talukdar P (2020) Composition-based multi-relational graph convolutional networks. In: International Conference on Learning Representations
[19] Nathani D, Chauhan J, Sharma C, Kaul M (2019) Learning attention-based embeddings for relation prediction in knowledge graphs. In: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
[20] Wang X, He Q, Liang J, Xiao Y (2022) Language models as knowledge embeddings. In: Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence (IJCAI-2022)
[21] Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez AN, Kaiser L, Polosukhin I (2017) Attention is all you need. Advances in Neural Information Processing Systems 30
[22] Baghershahi P, Hosseini R, Moradi H (2023) Self-attention presents low-dimensional knowledge graph embeddings for link prediction. Knowl-Based Syst 260:110124
[23] Bi Z, Cheng S, Chen J, Liang X, Xiong F, Zhang N (2024) Relphormer: Relational graph transformer for knowledge graph representations. Neurocomputing 566:127044
[24] Chen S, Liu X, Gao J, Jiao J, Zhang R, Ji Y (2021) HittER: Hierarchical transformers for knowledge graph embeddings. In: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
[25] Balažević I, Allen C, Hospedales TM (2019) Hypernetwork knowledge graph embeddings, pp 553–565
[26] Balažević I, Allen C, Hospedales T (2019) Multi-relational Poincaré graph embeddings. Advances in Neural Information Processing Systems 32:4465–4475
[27] Chami I, Wolf A, Juan D-C, Sala F, Ravi S, Ré C (2020) Low-dimensional hyperbolic knowledge graph embeddings. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
[28] Vashishth S, Sanyal S, Nitin V, Agrawal N, Talukdar P (2020) InteractE: Improving convolution-based knowledge graph embeddings by increasing feature interactions. Proceedings of the AAAI Conference on Artificial Intelligence 34(03):3009–3016
[29] Wang L, Zhao W, Wei Z, Liu J (2022) SimKGC: Simple contrastive knowledge graph completion with pre-trained language models. In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
[30] Das R, Dhuliawala S, Zaheer M, Vilnis L, Durugkar I, Krishnamurthy A, Smola AJ, McCallum A (2018) Go for a walk and arrive at the answer: Reasoning over paths in knowledge bases using reinforcement learning. In: International Conference on Learning Representations
[31] Sadeghian A, Armandpour M, Ding P, Wang DZ (2019) DRUM: End-to-end differentiable rule mining on knowledge graphs. Adv Neural Inf Process Syst 32:15321–15331
[32] Pan X, Ge C, Lu R, Song S, Chen G, Huang Z, Huang G (2022) On the integration of self-attention and convolution. In: 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
[33] Wu L, Li J, Wang Y, Meng Q, Qin T, Chen W, Zhang M, Liu T-Y, et al (2021) R-Drop: Regularized dropout for neural networks. Adv Neural Inf Process Syst 34:10890–10905
[34] Toutanova K, Chen D, Pantel P, Poon H, Choudhury P, Gamon M (2015) Representing text for joint embedding of text and knowledge bases. In: Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing
[35] Rossi A, Barbosa D, Firmani D, Matinata A, Merialdo P (2021) Knowledge graph embedding for link prediction: A comparative analysis. ACM Trans Knowl Discov Data 15(2):1–49
[36] Paszke A, Gross S, Massa F, Lerer A, Bradbury J, Chanan G, Killeen T, Lin Z, Gimelshein N, Antiga L, Desmaison A, Köpf A, Yang E, DeVito Z, Raison M, Tejani A, Chilamkurthy S, Steiner B, Fang L, Bai J, Chintala S (2019) PyTorch: An imperative style, high-performance deep learning library. Advances in Neural Information Processing Systems 32

Published In

Applied Intelligence, Volume 55, Issue 2
Jan 2025
1239 pages

Publisher

Kluwer Academic Publishers

United States

Publication History

Published: 09 December 2024
Accepted: 02 October 2024

Author Tags

  1. Knowledge graph
  2. Low-dimensional embedding
  3. 2D convolution
  4. Multi-head self-attention mechanism
  5. Feature interaction
  6. R-Drop structure

Qualifiers

  • Research-article
