Open Access

Inductive Relation Prediction by Disentangled Subgraph Structure

Laboratory of Intelligent Collaborative Computing, University of Electronic Science and Technology of China, Chengdu 611731, China, and also with Trusted Cloud Computing and Big Data Key Laboratory of Sichuan Province, Chengdu 611731, China
School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China
Hangzhou NetEase Cloud Music Technology Co., Ltd., Hangzhou 310052, China
Department of Fundamental Courses, Chengdu Textile College, Chengdu 611700, China

Abstract

Most existing inductive relation prediction approaches are based on subgraph structures, extracting subgraph features with graph neural networks to predict relations. However, subgraphs may contain disconnected regions, which usually correspond to different semantic ranges, and not all of this semantic information is helpful for relation prediction. We therefore propose a relation prediction model based on a disentangled subgraph structure and implement a feature-updating approach based on relevant semantic aggregation. To realize the disentangled subgraph structure indirectly, from a semantic perspective, entity features are mapped into different semantic spaces and updated by aggregating related semantics within each space. The disentangled model can thus focus on features with higher semantic relevance during prediction, addressing a shortcoming of existing approaches, which ignore the semantic differences among subgraph structures. Furthermore, the model enhances entity features with a gated recurrent neural network by sorting entities by distance and extracting path information from the subgraphs. Experiments show that when a subgraph contains numerous disconnected regions, our model outperforms existing mainstream models in terms of both Area Under the Curve-Precision-Recall (AUC-PR) and Hits@10, demonstrating that semantic differences in the knowledge graph can be effectively distinguished and verifying the effectiveness of the method.
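
To make the two mechanisms described in the abstract more concrete, the following is a minimal, hypothetical PyTorch sketch, not the authors' released code: entity features are projected into K semantic subspaces, neighbor messages are aggregated within each subspace using relevance (softmax) weights, and a GRU encodes a distance-sorted sequence of entity features as a stand-in for the path-information extraction. All class names, dimensions, and the dot-product relevance scoring are illustrative assumptions inferred from the abstract.

```python
# Hypothetical sketch of disentangled semantic aggregation and GRU path
# encoding, inferred from the abstract; NOT the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DisentangledAggregator(nn.Module):
    """Project entity features into K semantic subspaces and aggregate
    neighbor messages within each subspace using relevance weights."""

    def __init__(self, in_dim: int, k_spaces: int, sub_dim: int):
        super().__init__()
        # One linear map per semantic subspace (assumed design choice).
        self.projections = nn.ModuleList(
            [nn.Linear(in_dim, sub_dim) for _ in range(k_spaces)]
        )

    def forward(self, entity_feat, neighbor_feats):
        # entity_feat:    (in_dim,)          target entity
        # neighbor_feats: (num_nb, in_dim)   its subgraph neighbors
        updated = []
        for proj in self.projections:
            h = proj(entity_feat)             # (sub_dim,)
            m = proj(neighbor_feats)          # (num_nb, sub_dim)
            # Relevance of each neighbor within this subspace (dot-product score).
            scores = F.softmax(m @ h, dim=0)  # (num_nb,)
            agg = scores @ m                  # weighted sum, (sub_dim,)
            updated.append(h + agg)           # residual-style feature update
        # Concatenate the K subspace features back into one vector.
        return torch.cat(updated, dim=-1)     # (k_spaces * sub_dim,)


class PathEncoder(nn.Module):
    """Encode entities sorted by distance to the target with a GRU,
    approximating the path-information extraction described above."""

    def __init__(self, feat_dim: int, hidden_dim: int):
        super().__init__()
        self.gru = nn.GRU(feat_dim, hidden_dim, batch_first=True)

    def forward(self, sorted_path_feats):
        # sorted_path_feats: (path_len, feat_dim), nearest entity first
        _, h_n = self.gru(sorted_path_feats.unsqueeze(0))
        return h_n.squeeze(0).squeeze(0)      # (hidden_dim,)


if __name__ == "__main__":
    agg = DisentangledAggregator(in_dim=32, k_spaces=4, sub_dim=8)
    enc = PathEncoder(feat_dim=32, hidden_dim=16)
    entity = torch.randn(32)
    neighbors = torch.randn(5, 32)
    path = torch.randn(3, 32)
    print(agg(entity, neighbors).shape)  # torch.Size([32])
    print(enc(path).shape)               # torch.Size([16])
```

The per-subspace softmax is one simple way to let each semantic space attend only to neighbors relevant to it, which is the intuition behind "relevant semantic aggregation" in the abstract; the paper's actual scoring function and update rule may differ.
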

Tsinghua Science and Technology
Pages 1566-1579
Cite this article:
Duan G, Guo R, Luo W, et al. Inductive Relation Prediction by Disentangled Subgraph Structure. Tsinghua Science and Technology, 2024, 29(5): 1566-1579. https://doi.org/10.26599/TST.2023.9010154


Received: 26 July 2023
Revised: 03 December 2023
Accepted: 18 December 2023
Published: 02 May 2024
© The Author(s) 2024.

The articles published in this open access journal are distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).
