Temporal Knowledge Graph Embedding with Pre-trained Language Model

Abstract
Large language models (LLMs) have demonstrated exceptional performance in natural language processing, spurring extensive research on knowledge extraction, knowledge fusion, knowledge representation, and knowledge completion with pre-trained language models (PLMs). Most existing work focuses on static multi-relational knowledge graphs (KGs). Temporal knowledge graphs (TKGs), by contrast, incorporate temporal information but have received little attention in PLM- or LLM-based research. In this paper, we introduce PT2KGC, a temporal knowledge graph embedding model that employs a pre-trained language model for TKG completion and extrapolation. We present three approaches for modeling temporal knowledge in PT2KGC: original knowledge embedding, explicit time modeling, and implicit time modeling. PT2KGC(Org.) relies solely on static knowledge; PT2KGC(Exp.) explicitly incorporates timestamps into quadruples; and PT2KGC(Imp.) models time implicitly through dataset reconstruction. We conduct experiments on two public TKG datasets, and the results demonstrate the effectiveness of pre-trained language models for TKG embedding. On three types of tasks, all three modeling methods of PT2KGC outperform existing models. We additionally compare PT2KGC's performance under the different time modeling approaches.
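The distinction the abstract draws between explicit and implicit time modeling can be illustrated with a minimal sketch. This is not the paper's actual implementation; the `[SEP]`-delimited textualization, function names, and the example quadruple below are all illustrative assumptions about how a TKG fact (subject, relation, object, timestamp) might be serialized into text for a PLM to score:

```python
# Illustrative sketch only (not PT2KGC's actual code): serializing a TKG
# quadruple into a text sequence for a pre-trained language model.

def explicit_input(subject: str, relation: str, obj: str, timestamp: str) -> str:
    """Explicit time modeling: the timestamp appears as a token in the input,
    so the PLM can condition on it directly."""
    return f"{subject} [SEP] {relation} [SEP] {timestamp} [SEP] {obj}"

def implicit_input(subject: str, relation: str, obj: str) -> str:
    """Implicit time modeling: no timestamp token in the input; temporal
    information would instead be reflected in how the training dataset is
    reconstructed (e.g. ordering or grouping facts by time step)."""
    return f"{subject} [SEP] {relation} [SEP] {obj}"

# Hypothetical quadruple for illustration.
quad = ("Germany", "consult", "France", "2014-05-13")
print(explicit_input(*quad))
print(implicit_input(*quad[:3]))
```

Under explicit modeling, two facts that differ only in timestamp yield different inputs; under implicit modeling they yield the same input, and the temporal signal must come from the surrounding training setup.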
| Original language | English |
|---|---|
| Pages (from-to) | 332-345 |
| Number of pages | 14 |
| Journal | Procedia Computer Science |
| Volume | 264 |
| DOIs | |
| State | Published - 2025 |
| Externally published | Yes |
| Event | International Neural Network Society Workshop on Deep Learning Innovations and Applications, IJCNN 2025 - Rome, Italy Duration: 30 Jun 2025 → 5 Jul 2025 |
Keywords
- Knowledge graph
- Knowledge graph representation
- Pre-trained language model
- Temporal knowledge graph