BiLSTM-SDTCN-AutoCorr: A Hybrid Model for Stock Price Prediction Integrating Sequence Decomposition and Autocorrelation Attention
DOI:
https://doi.org/10.62051/ijcsit.v8n1.07
Keywords:
Transformer, Hybrid Neural Network, Stock Price Prediction, Auto-Correlation Attention, Deep Learning
Abstract
Stock price prediction is challenging due to nonlinearity, non-stationarity, and noise contamination; traditional econometric models and early deep learning methods struggle to capture the resulting complex temporal patterns. This paper proposes a hybrid neural network, BiLSTM-SDTCN-AutoCorr, which refines a BiLSTM–Transformer backbone in three ways: a sequence decomposition module partitions the input series into trend and seasonal components to filter noise and sharpen pattern separation; vanilla self-attention is replaced by autocorrelation attention, which captures periodic dependencies efficiently via the fast Fourier transform; and the Transformer decoder is replaced with temporal convolutional network (TCN) layers to strengthen local sequence modeling. Evaluated on five stock index datasets, the model shows significant superiority across evaluation metrics, offering an efficient and robust solution for stock price prediction with practical applicability.
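The two core ideas named in the abstract can be sketched in plain NumPy. The following is a minimal illustration, not the paper's implementation: the moving-average decomposition and the FFT-based autocorrelation attention follow the Autoformer-style formulation the abstract alludes to, and all function names and hyperparameters (`kernel`, `top_k`) are illustrative assumptions.

```python
import numpy as np

def series_decompose(x, kernel=25):
    """Moving-average decomposition of a 1-D series into trend and
    seasonal components (kernel size is an assumed hyperparameter)."""
    pad = kernel // 2
    padded = np.pad(x, (pad, pad), mode="edge")
    trend = np.convolve(padded, np.ones(kernel) / kernel, mode="valid")
    seasonal = x - trend
    return trend, seasonal

def autocorrelation_attention(q, k, v, top_k=3):
    """Period-based attention: estimate the query-key correlation for all
    lags at once via FFT (Wiener-Khinchin theorem), keep the top-k lags,
    and aggregate time-delayed copies of the values.
    q, k, v: arrays of shape (L, d)."""
    L = q.shape[0]
    fq = np.fft.rfft(q, axis=0)
    fk = np.fft.rfft(k, axis=0)
    # inverse transform of the cross-spectrum gives correlation per lag
    corr = np.fft.irfft(fq * np.conj(fk), n=L, axis=0).mean(axis=-1)  # (L,)
    lags = np.argsort(corr)[-top_k:]                  # most correlated lags
    weights = np.exp(corr[lags])
    weights /= weights.sum()                          # softmax over top-k lags
    out = np.zeros_like(v)
    for w, lag in zip(weights, lags):
        out += w * np.roll(v, -int(lag), axis=0)      # time-delay aggregation
    return out
```

Because the correlation for every lag is computed with one pair of FFTs, the lag search costs O(L log L) rather than the O(L^2) of dot-product self-attention, which is the efficiency argument behind replacing vanilla attention.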
License
Copyright (c) 2026 International Journal of Computer Science and Information Technology

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
