BiGRU-Att: Blood Glucose Prediction with Bidirectional Recurrent Neural Networks and Attention Mechanisms
DOI: https://doi.org/10.62051/9zwa3x81
Keywords: BiGRU; Attention; Residual Blocks; Diabetes Management
Abstract
This paper presents a deep learning model that combines a bidirectional gated recurrent unit (BiGRU) with an attention mechanism for predicting blood glucose levels. The novelty lies in the model's structural design: the BiGRU captures both forward and backward dependencies in the time-series data, while the attention mechanism sharpens the model's sensitivity to critical time steps, improving prediction accuracy. This combination exploits the strength of the BiGRU in processing sequential data and, through the attention mechanism, adaptively weights the important features in the input. We validate the model on 10 in silico datasets generated by the UVA/Padova T1D simulator. On these datasets the model achieves a root mean square error (RMSE) of 0.0719, substantially lower than that of existing techniques, demonstrating its effectiveness for blood glucose prediction tasks.
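The attention step described above — scoring each BiGRU hidden state and forming a weighted summary of the sequence — can be sketched in a few lines. This is a minimal NumPy illustration, not the authors' implementation; the scoring vector `w` and the helper `attention_pool` are hypothetical stand-ins for the learned attention parameters and layer.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D array of scores
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_pool(hidden_states, w):
    """Weight BiGRU hidden states by attention scores.

    hidden_states: (T, 2H) array, one row per time step
                   (2H because forward and backward states are concatenated)
    w: (2H,) scoring vector, a stand-in for the learned attention parameters
    """
    scores = hidden_states @ w          # (T,) unnormalized relevance per time step
    alphas = softmax(scores)            # attention weights, sum to 1
    context = alphas @ hidden_states    # (2H,) weighted summary fed to the predictor
    return context, alphas

# toy example: 5 time steps, hidden size 2 per direction (2H = 4)
rng = np.random.default_rng(0)
h = rng.normal(size=(5, 4))
context, alphas = attention_pool(h, rng.normal(size=4))
```

A downstream regression head would then map `context` to the predicted glucose value; time steps with larger `alphas` contribute more to that summary.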
References
[1] M. A. Atkinson, G. S. Eisenbarth, and A. W. Michels, "Type 1 diabetes," The Lancet, vol. 383, no. 9911, pp. 69-82, 2014.
[2] E. I. Georga, V. C. Protopappas, D. Ardigo, M. Marina, I. Zavaroni, D. Polyzos, and D. I. Fotiadis, "Multivariate prediction of subcutaneous glucose concentration in type 1 diabetes patients based on support vector regression," IEEE Journal of Biomedical and Health Informatics, vol. 17, no. 1, pp. 71-81, 2013.
[3] S. Agatonovic-Kustrin and R. Beresford, "Basic concepts of artificial neural network (ANN) modeling and its application in pharmaceutical research," Journal of Pharmaceutical and Biomedical Analysis, vol. 22, no. 5, pp. 717-727, 2000.
[4] K. Cho, B. Van Merriënboer, C. Gulcehre, et al., "Learning phrase representations using RNN encoder-decoder for statistical machine translation," arXiv preprint arXiv:1406.1078, 2014.
[5] Y. Yu, X. Si, C. Hu, et al., "A review of recurrent neural networks: LSTM cells and network architectures," Neural Computation, vol. 31, no. 7, pp. 1235-1270, 2019.
[6] S. J. Taylor and B. Letham, "Forecasting at scale," The American Statistician, vol. 72, no. 1, pp. 37-45, 2018.
[7] A. Vaswani, N. Shazeer, N. Parmar, et al., "Attention is all you need," Advances in Neural Information Processing Systems, vol. 30, 2017.
[8] J. Chung, Ç. Gülçehre, K. Cho, and Y. Bengio, "Empirical evaluation of gated recurrent neural networks on sequence modeling," arXiv preprint arXiv:1412.3555, 2014.
[9] F. Li, M. Zhang, G. Fu, T. Qian, and D. Ji, "A Bi-LSTM-RNN model for relation classification using low-cost sequence features," arXiv preprint arXiv:1608.07720, 2016.
[10] J. Kumar, R. Goomer, and A. K. Singh, "Long short term memory recurrent neural network (LSTM-RNN) based workload forecasting model for cloud datacenters," Procedia Computer Science, vol. 125, pp. 676-682, 2018.
[11] J. Han and C. Moraga, "The influence of the sigmoid function parameters on the speed of backpropagation learning," in International Workshop on Artificial Neural Networks, Berlin, Heidelberg: Springer, 1995, pp. 195-201.
[12] D. J. Jamadar, R. B. Pittala, D. G. Silpa, P. Arunkumar, A. Ahmad, and D. G. Prasad, "GRU-RNN model to analyze and predict the inflation by consumer price index," Journal of Electrical Systems, 2024.
[13] J. Zhang, P. Wang, and R. X. Gao, "Attention mechanism-incorporated deep learning for AM part quality prediction," Procedia CIRP, vol. 93, pp. 96-101, 2020.
[14] R. Rojas, "The backpropagation algorithm," in Neural Networks: A Systematic Introduction, Springer, 1996, pp. 149-182.
[15] Q. V. Le, J. Ngiam, A. Coates, et al., "On optimization methods for deep learning," in Proceedings of the 28th International Conference on Machine Learning, 2011, pp. 265-272.
[16] M. E. Pennant, L. J. C. Bluck, M. L. Marcovecchio, et al., "Insulin administration and rate of glucose appearance in people with type 1 diabetes," Diabetes Care, vol. 31, no. 11, pp. 2183-2187, 2008.
License

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.