TY - GEN
T1 - Generating Time Series by Using Latent Space
AU - Cui, Xinyu
AU - Zhang, Chunkai
N1 - Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2024.
PY - 2024
Y1 - 2024
N2 - Time series forecasting is a crucial aspect of analyzing time series data, enabling predictions about future trends. Deep learning methods, particularly the transformer model, have become popular in time series forecasting. However, most existing models are discriminative, focusing on the relationship between past and future values. In contrast, time series data is generated from a high-dimensional latent space. This paper introduces LaTrans, a novel transformer-based time series forecasting model. Unlike previous models, LaTrans leverages the concept of latent space, where future time series can be generated. The model combines the power of latent space and transformer architectures, using attention layers to extract probability distributions and compress them into the latent space. Furthermore, it demonstrates the importance of the latent space in time series forecasting and shows that future time series can be generated within this space. The paper compares LaTrans with other transformer-based methods and investigates the influence of the KL divergence weight on forecasting results. These findings contribute to advancing the field of time series forecasting, highlighting the benefits of incorporating latent space and providing a new model that outperforms existing transformer-based approaches.
AB - Time series forecasting is a crucial aspect of analyzing time series data, enabling predictions about future trends. Deep learning methods, particularly the transformer model, have become popular in time series forecasting. However, most existing models are discriminative, focusing on the relationship between past and future values. In contrast, time series data is generated from a high-dimensional latent space. This paper introduces LaTrans, a novel transformer-based time series forecasting model. Unlike previous models, LaTrans leverages the concept of latent space, where future time series can be generated. The model combines the power of latent space and transformer architectures, using attention layers to extract probability distributions and compress them into the latent space. Furthermore, it demonstrates the importance of the latent space in time series forecasting and shows that future time series can be generated within this space. The paper compares LaTrans with other transformer-based methods and investigates the influence of the KL divergence weight on forecasting results. These findings contribute to advancing the field of time series forecasting, highlighting the benefits of incorporating latent space and providing a new model that outperforms existing transformer-based approaches.
KW - Forecasting
KW - Machine learning
KW - Time series
UR - https://www.scopus.com/pages/publications/85201190510
U2 - 10.1007/978-981-97-5666-7_22
DO - 10.1007/978-981-97-5666-7_22
M3 - Conference contribution
AN - SCOPUS:85201190510
SN - 9789819756650
T3 - Lecture Notes in Computer Science
SP - 258
EP - 268
BT - Advanced Intelligent Computing Technology and Applications - 20th International Conference, ICIC 2024, Proceedings
A2 - Huang, De-Shuang
A2 - Pan, Yijie
A2 - Zhang, Chuanlei
PB - Springer Science and Business Media Deutschland GmbH
T2 - 20th International Conference on Intelligent Computing, ICIC 2024
Y2 - 5 August 2024 through 8 August 2024
ER -