Open Access
A Temporal Knowledge Graph Embedding Model Based on Variable Translation
Tsinghua Science and Technology 2024, 29(5): 1554-1565
Published: 02 May 2024
Abstract

Knowledge representation learning (KRL) aims to encode the entities and relations of a knowledge graph into low-dimensional continuous vectors. It is widely used in knowledge graph completion (KGC), also known as link prediction. Translation-based KRL methods perform well on KGC. However, the translation principles they adopt are too strict and cannot model complex relations (i.e., 1-N, N-1, and N-N) well. Moreover, these traditional translation principles are designed for static knowledge graphs and overlook the temporal properties of triplet facts. We therefore propose a temporal knowledge graph embedding model based on variable translation (TKGE-VT). The model introduces a new variable translation principle that enables flexible transformation between entity and relation embeddings. Meanwhile, it accounts for the temporal properties of both entities and relations and applies the proposed variable translation principle to temporal knowledge graphs. We conduct link prediction and triplet classification experiments on four benchmark datasets: WN11, WN18, FB13, and FB15K. The experimental results show that our model outperforms baseline models on multiple evaluation metrics.
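The abstract does not spell out the exact variable-translation scoring function of TKGE-VT, but the strict translation principle it builds on (TransE-style h + r ≈ t) is standard. The sketch below, a minimal PyTorch illustration under that assumption, shows the baseline principle that TKGE-VT is said to relax; the class name, dimensions, and initialization are illustrative choices, not the authors' implementation.

```python
import torch
import torch.nn as nn

class TransEScore(nn.Module):
    """Classic translation principle: score(h, r, t) = -||h + r - t||.

    This is the strict principle the abstract says TKGE-VT relaxes; the
    paper's variable-translation function is not given in the abstract,
    so this sketch only illustrates the baseline idea.
    """

    def __init__(self, num_entities, num_relations, dim=64):
        super().__init__()
        self.ent = nn.Embedding(num_entities, dim)
        self.rel = nn.Embedding(num_relations, dim)
        nn.init.xavier_uniform_(self.ent.weight)
        nn.init.xavier_uniform_(self.rel.weight)

    def forward(self, heads, relations, tails):
        # Look up embeddings for a batch of (head, relation, tail) indices.
        h = self.ent(heads)
        r = self.rel(relations)
        t = self.ent(tails)
        # Higher score = more plausible triple under h + r ≈ t (L1 distance).
        return -torch.norm(h + r - t, p=1, dim=-1)

# Usage: score a small batch of triples with hypothetical indices.
model = TransEScore(num_entities=1000, num_relations=50)
heads = torch.tensor([0, 1])
rels = torch.tensor([3, 7])
tails = torch.tensor([42, 99])
print(model(heads, rels, tails))  # one plausibility score per triple
```

A variable-translation or temporal model would replace the fixed offset r with a transformation that can vary per triple and per timestamp, which is what lets it fit 1-N, N-1, and N-N relations that a single fixed translation cannot.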
