Preprint Article, Version 1. This version is not peer-reviewed.

xLSTMTime: Long-Term Time Series Forecasting With xLSTM

Version 1 : Received: 13 July 2024 / Approved: 15 July 2024 / Online: 16 July 2024 (04:06:31 CEST)

How to cite: Alharthi, M.; Mahmood, A. xLSTMTime: Long-Term Time Series Forecasting With xLSTM. Preprints 2024, 2024071246. https://doi.org/10.20944/preprints202407.1246.v1

Abstract

In recent years, transformer-based models have gained prominence in multivariate long-term time series forecasting (LTSF), achieving significant advances despite challenges such as high computational cost and difficulty in capturing temporal dynamics and long-term dependencies. The emergence of LTSF-Linear, with its straightforward linear architecture, has notably outperformed transformer-based counterparts, prompting a reevaluation of the transformer's utility in time series forecasting. In response, this paper presents an adaptation of a recent architecture, extended LSTM (xLSTM), to LTSF. xLSTM incorporates exponential gating and a revised memory structure with higher capacity, giving it strong potential for LTSF. Our adapted architecture, termed xLSTMTime, surpasses current approaches. We compare xLSTMTime's performance against various state-of-the-art models across multiple real-world datasets, demonstrating superior forecasting capabilities. Our findings suggest that refined recurrent architectures can offer competitive alternatives to transformer-based models in LTSF tasks, potentially redefining the landscape of time series forecasting.
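As a rough illustration of the exponential gating mentioned in the abstract, the sketch below implements one step of the sLSTM cell from the xLSTM paper (Beck et al., 2024), including the normalizer and log-domain stabilizer states that keep the exponential gates numerically safe. The function name, weight layout, and initialization here are our own illustrative assumptions, not the authors' xLSTMTime implementation.

```python
# Minimal sketch of one sLSTM step with exponential gating, following the
# xLSTM formulation (Beck et al., 2024). Weight names are illustrative only.
import numpy as np

def slstm_step(x, h_prev, c_prev, n_prev, m_prev, W, R, b):
    """One sLSTM time step.

    W, R, b: dicts of input weights, recurrent weights, and biases for the
    gates z (cell input), i (input), f (forget), o (output).
    c: cell state, n: normalizer state, m: stabilizer state (log domain).
    """
    pre = {g: W[g] @ x + R[g] @ h_prev + b[g] for g in "zifo"}

    z = np.tanh(pre["z"])                    # candidate cell input
    o = 1.0 / (1.0 + np.exp(-pre["o"]))      # sigmoid output gate

    # Exponential gates, stabilized in the log domain so exp() cannot overflow.
    m = np.maximum(pre["f"] + m_prev, pre["i"])  # new stabilizer state
    i = np.exp(pre["i"] - m)                     # stabilized input gate
    f = np.exp(pre["f"] + m_prev - m)            # stabilized forget gate

    c = f * c_prev + i * z   # cell state update
    n = f * n_prev + i       # normalizer tracks accumulated gate mass
    h = o * (c / n)          # normalized hidden state
    return h, c, n, m

# Hypothetical usage: run a short sequence through one cell.
d, dx = 8, 4
rng = np.random.default_rng(0)
W = {g: rng.normal(scale=0.1, size=(d, dx)) for g in "zifo"}
R = {g: rng.normal(scale=0.1, size=(d, d)) for g in "zifo"}
b = {g: np.zeros(d) for g in "zifo"}
h = c = n = m = np.zeros(d)
for x in rng.normal(size=(10, dx)):
    h, c, n, m = slstm_step(x, h, c, n, m, W, R, b)
```

Because the input gate is strictly positive, the normalizer n is nonzero from the first step, so the division in the hidden-state update is well defined.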

Keywords

xLSTM; transformer; linear network; time series forecasting; state-space model

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
