The objective of knowledge graph completion is to capture the structure and inherent relationships of domain knowledge, thereby providing a valuable foundation for knowledge reasoning and analysis. However, existing completion methods face challenges: rule-based methods offer high accuracy and interpretability but struggle to scale to large knowledge graphs, whereas embedding-based methods are scalable and efficient but make limited use of domain knowledge. To address these issues, we propose a pre-training and inference method for knowledge graph completion that integrates rules. The approach first combines rule mining and rule-based reasoning to generate precise candidate facts; a pre-trained language model is then fine-tuned with an added probabilistic structural loss to embed the knowledge graph. This enables the language model to capture deeper semantic information while the loss function reconstructs the structure of the knowledge graph. Extensive experiments on several publicly available datasets show that the proposed model outperforms existing methods on knowledge graph completion tasks.
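To make the described pipeline concrete, the following is a minimal sketch (assuming PyTorch and HuggingFace Transformers) of how a pre-trained language model could score rule-generated candidate triples while a structural term is added to the fine-tuning loss. The model name bert-base-uncased, the projection head, the TransE-style distance standing in for the paper's probabilistic structural loss, and the 0.1 weighting are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch: fine-tune a PLM to score rule-generated candidate triples
# while adding a TransE-style structural term. The model name, projection head,
# and loss weighting are illustrative assumptions, not the paper's exact design.
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel


class PLMTripleScorer(nn.Module):
    def __init__(self, plm_name: str = "bert-base-uncased", struct_dim: int = 128):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(plm_name)
        hidden = self.encoder.config.hidden_size
        self.score_head = nn.Linear(hidden, 1)            # triple plausibility score
        self.struct_proj = nn.Linear(hidden, struct_dim)  # embeddings for the structural term

    def encode(self, texts, tokenizer):
        batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
        out = self.encoder(**batch)
        return out.last_hidden_state[:, 0]  # [CLS] representation

    def forward(self, heads, relations, tails, tokenizer):
        # Semantic score of the whole triple from its textual form.
        triple_texts = [f"{h} [SEP] {r} [SEP] {t}" for h, r, t in zip(heads, relations, tails)]
        logits = self.score_head(self.encode(triple_texts, tokenizer)).squeeze(-1)

        # Separate encodings of h, r, t feed a TransE-like structural term ||h + r - t||.
        h_emb = self.struct_proj(self.encode(list(heads), tokenizer))
        r_emb = self.struct_proj(self.encode(list(relations), tokenizer))
        t_emb = self.struct_proj(self.encode(list(tails), tokenizer))
        struct_dist = torch.norm(h_emb + r_emb - t_emb, p=2, dim=-1)
        return logits, struct_dist


if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = PLMTripleScorer()

    # Candidate facts as produced by a rule mining/reasoning stage (placeholders).
    heads = ["Paris", "Einstein"]
    relations = ["capital of", "born in"]
    tails = ["France", "Ulm"]
    labels = torch.tensor([1.0, 1.0])  # rule-derived candidates treated as positives

    logits, struct_dist = model(heads, relations, tails, tokenizer)
    semantic_loss = nn.functional.binary_cross_entropy_with_logits(logits, labels)
    structural_loss = struct_dist.mean()          # encourages h + r to be close to t
    loss = semantic_loss + 0.1 * structural_loss  # 0.1 is an assumed weighting
    loss.backward()
    print(f"loss = {loss.item():.4f}")
```

In this sketch the semantic term lets the language model judge triple plausibility from text, while the structural term pulls the projected entity and relation embeddings toward a graph-consistent geometry, mirroring the division of labour described above.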