The Analysis of Time Series Forecasting Based on MLP Models

Authors

  • Ruyi Xie

DOI:

https://doi.org/10.62051/qmrw8m76

Keywords:

Time Series Forecasting; Multi-layer Perceptron; Frequency-Domain; Deep Learning.

Abstract

The field of time series forecasting (TSF) increasingly leverages deep learning architectures. This study examines recent advances in Multi-layer Perceptron (MLP)-based models for TSF, focusing on the MLP Mixer, MLP Encoder-Decoder, and Frequency-Domain MLP models. Each model's principles are analyzed to identify commonalities and potential areas for improvement. Through techniques such as dimension transformation, channel independence, and residual connections, these approaches improve the ability to capture features and global dependencies in complex, lengthy time series, yielding more accurate and stable predictions. The MLP Mixer alternates mixing operations between the temporal and feature dimensions; the MLP Encoder-Decoder concentrates information processing in dense encoding and decoding stages; and the Frequency-Domain MLP transforms the series from the time domain to the frequency domain before applying MLPs. Experimental results show that the Frequency-Domain MLP model performs best, although the choice of lookback window significantly affects the results. Future research will aim to optimize these models further, exploring innovations in mixing strategies, residual structures, and large-scale models; improving their generalization ability and computational efficiency will advance the field of TSF.
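
As a concrete illustration of the frequency-domain idea summarized above, the sketch below shows a minimal channel-independent frequency-domain MLP forecaster in PyTorch. It is a simplified sketch under assumed design choices, not the architecture of any published model: the class name FreqMLP, the hidden width of 128, and the use of separate MLPs for the real and imaginary parts of the spectrum are all illustrative assumptions. The essential pattern is the one described in the abstract: map the lookback window to the frequency domain with an FFT, apply MLPs to the spectrum, and map back to the time domain to produce the forecast.

import torch
import torch.nn as nn

class FreqMLP(nn.Module):
    """Minimal sketch of a frequency-domain MLP forecaster.
    Illustrative only; layer sizes and the real/imaginary split are assumptions."""

    def __init__(self, lookback: int, horizon: int, hidden: int = 128):
        super().__init__()
        self.horizon = horizon
        n_in = lookback // 2 + 1   # rFFT bins of the lookback window
        n_out = horizon // 2 + 1   # rFFT bins of the forecast window
        # Separate MLPs act on the real and imaginary parts of the spectrum.
        self.real_mlp = nn.Sequential(
            nn.Linear(n_in, hidden), nn.ReLU(), nn.Linear(hidden, n_out))
        self.imag_mlp = nn.Sequential(
            nn.Linear(n_in, hidden), nn.ReLU(), nn.Linear(hidden, n_out))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, lookback); each channel passes through the
        # same MLPs independently (channel independence).
        spec = torch.fft.rfft(x, dim=-1)           # time -> frequency
        real = self.real_mlp(spec.real)
        imag = self.imag_mlp(spec.imag)
        out_spec = torch.complex(real, imag)
        # Inverse FFT returns to the time domain at the forecast length.
        return torch.fft.irfft(out_spec, n=self.horizon, dim=-1)

model = FreqMLP(lookback=96, horizon=24)
x = torch.randn(8, 7, 96)   # 8 samples, 7 channels, lookback of 96
y = model(x)                # -> (8, 7, 24) forecast

Because the MLPs act on whole spectra, every frequency component mixes information from the entire lookback window, which is one way such models capture the global dependencies mentioned above; the full models additionally layer in residual connections and mixing across dimensions.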

Published

25-11-2024

How to Cite

Xie, R. (2024) “The Analysis of Time Series Forecasting Based on MLP Models”, Transactions on Computer Science and Intelligent Systems Research, 7, pp. 326–332. doi:10.62051/qmrw8m76.