
Temporal Knowledge Graph Embedding with Pre-trained Language Model

  • Wenying Feng
  • Jianming Li
  • Haiyan Wang
  • Zhaoquan Gu* (*Corresponding author for this work)

Affiliations:

  • Pengcheng Laboratory
  • Harbin Institute of Technology

Research output: Contribution to journal › Conference article › peer-review

Abstract

Large language models (LLMs) have demonstrated exceptional performance in natural language processing. This has spurred extensive research on knowledge extraction, knowledge fusion, knowledge representation, and knowledge completion using pre-trained language models (PLMs). Most existing work, however, focuses on static multi-relational knowledge graphs (KGs); temporal knowledge graphs (TKGs), which additionally incorporate temporal information, remain under-explored with PLMs or LLMs. In this paper, we introduce PT2KGC, a temporal knowledge graph embedding model that employs a pre-trained language model for TKG completion and extrapolation. We present three approaches for modeling temporal knowledge in PT2KGC: original knowledge embedding, explicit time modeling, and implicit time modeling. PT2KGC(Org.) relies solely on static knowledge; PT2KGC(Exp.) explicitly incorporates timestamps into quadruples; and PT2KGC(Imp.) models time implicitly through dataset reconstruction. Experiments on two public TKG datasets demonstrate the effectiveness of pre-trained language models for TKG embedding. Results on three types of tasks show that all three modeling methods of PT2KGC outperform existing models. We also compare the performance of PT2KGC under the different time-modeling approaches.
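The abstract does not include code, but the distinction between explicit and implicit time modeling can be pictured as two ways of turning a TKG quadruple (subject, relation, object, timestamp) into a text sequence for a PLM. The sketch below is purely illustrative: function names, the `[CLS]`/`[SEP]` sequence format, and the relation-plus-timestamp reconstruction are assumptions, not the paper's actual method.

```python
# Illustrative sketch only -- not PT2KGC's actual input format.
# Explicit time modeling: the timestamp appears as its own segment in
# the input text, so the PLM sees time directly.
def verbalize_explicit(subj: str, rel: str, obj: str, ts: str) -> str:
    return f"[CLS] {subj} [SEP] {rel} [SEP] {obj} [SEP] {ts} [SEP]"

# Implicit time modeling: the timestamp is folded into the relation
# label, one hypothetical way to "reconstruct the dataset" so that time
# is carried implicitly rather than as an explicit segment.
def verbalize_implicit(subj: str, rel: str, obj: str, ts: str) -> str:
    return f"[CLS] {subj} [SEP] {rel}_{ts} [SEP] {obj} [SEP]"

quad = ("Barack Obama", "visit", "France", "2014-04-01")
print(verbalize_explicit(*quad))
# [CLS] Barack Obama [SEP] visit [SEP] France [SEP] 2014-04-01 [SEP]
print(verbalize_implicit(*quad))
# [CLS] Barack Obama [SEP] visit_2014-04-01 [SEP] France [SEP]
```

Either string could then be scored by a PLM-based plausibility classifier; PT2KGC(Org.) would correspond to dropping the timestamp entirely and scoring only the static triple.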

Original language: English
Pages (from-to): 332-345
Number of pages: 14
Journal: Procedia Computer Science
Volume: 264
State: Published - 2025
Externally published: Yes
Event: International Neural Network Society Workshop on Deep Learning Innovations and Applications, IJCNN 2025 - Rome, Italy
Duration: 30 Jun 2025 - 5 Jul 2025

Keywords

  • Knowledge graph
  • knowledge graph representation
  • pre-trained language model
  • temporal knowledge graph
