RP-KGC: A Knowledge Graph Completion Model Integrating Rule-Based Knowledge for Pretraining and Inference

School of Computing and Artificial Intelligence, Engineering Research Center of Sustainable Urban Intelligent Transportation (Ministry of Education), and National Engineering Laboratory of Integrated Transportation Big Data Application Technology, Southwest Jiaotong University, Chengdu 611756, China

Abstract

The objective of knowledge graph completion is to comprehend the structure and inherent relationships of domain knowledge, thereby providing a valuable foundation for knowledge reasoning and analysis. However, existing completion methods face challenges: rule-based methods offer high accuracy and interpretability but struggle to scale to large knowledge graphs, whereas embedding-based methods are scalable and efficient but make limited use of domain knowledge. To address these issues, we propose a pretraining and inference method for knowledge graph completion that integrates rule-based knowledge. The approach combines rule mining and rule-based reasoning to generate precise candidate facts. A pretrained language model is then fine-tuned with a probabilistic structural loss to embed the knowledge graph, enabling the language model to capture deeper semantic information while the loss function reconstructs the structure of the knowledge graph. Extensive experiments on several publicly available datasets show that the proposed model outperforms existing techniques on knowledge graph completion tasks.
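The rule-based candidate generation step described above can be illustrated with a minimal sketch. The function name, the toy relations, and the restriction to two-hop Horn rules are illustrative assumptions, not the paper's actual implementation: a mined rule of the form rel1(x, y) ∧ rel2(y, z) ⇒ head_rel(x, z) is applied to known triples to propose candidate facts not yet in the graph.

```python
from collections import defaultdict

def apply_chain_rule(triples, rel1, rel2, head_rel):
    """Apply a two-hop Horn rule rel1(x, y) AND rel2(y, z) => head_rel(x, z)
    to a list of (head, relation, tail) triples, returning the set of
    candidate facts that are not already present in the graph."""
    known = set(triples)
    # Index rel1 triples by tail entity: y -> all x with rel1(x, y).
    rel1_by_tail = defaultdict(list)
    for h, r, t in triples:
        if r == rel1:
            rel1_by_tail[t].append(h)
    # Join against rel2 triples on the shared variable y.
    candidates = set()
    for h, r, t in triples:
        if r == rel2:
            for x in rel1_by_tail[h]:
                fact = (x, head_rel, t)
                if fact not in known:
                    candidates.add(fact)
    return candidates

kg = [
    ("alice", "born_in", "paris"),
    ("paris", "city_of", "france"),
    ("bob", "born_in", "lyon"),
    ("lyon", "city_of", "france"),
    ("bob", "nationality", "france"),  # already known, so not a candidate
]
new_facts = apply_chain_rule(kg, "born_in", "city_of", "nationality")
```

In the full model, candidates produced this way would then be scored by the fine-tuned language model rather than accepted outright, which is how rule precision and embedding scalability are combined.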

Big Data Mining and Analytics
Pages 18-30
Cite this article:
Guo W, Du S, Hu J, et al. RP-KGC: A Knowledge Graph Completion Model Integrating Rule-Based Knowledge for Pretraining and Inference. Big Data Mining and Analytics, 2025, 8(1): 18-30. https://doi.org/10.26599/BDMA.2024.9020063