Wearable Activity Recognition for Advancing Elderly Care with Modified Transformer Model
DOI: https://doi.org/10.62051/ijcsit.v4n2.30

Keywords: Transformer, Long Short-Term Memory (LSTM), Wearable activity recognition, Deep learning, Elderly care

Abstract
Technological devices such as smartphones can track various human activities and movements through built-in accelerometers and gyroscopes. Data obtained from these inertial sensors can support a range of applications that assist people, including healthcare, human-computer interaction, and sports. As a result, further development of effective classification methods for time-series data through machine and deep learning is highly valued and actively pursued. In this study, the transformer model, a deep learning architecture designed for sequential data and widely used in natural language processing (NLP), is applied to time-series motion readings from wearable accelerometers. The transformer model is refined by incorporating a Long Short-Term Memory (LSTM) recurrent neural network (RNN). By leveraging the HAR70+ dataset, which covers a wide range of activities, the modified transformer model achieved a best accuracy of 95.85%, demonstrating that it can match the performance of state-of-the-art wearable activity recognition methods based on Deep Neural Networks (DNN) and LSTM. These findings suggest that improved transformer and deep learning models can help enhance the quality of life for seniors.
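The hybrid design described in the abstract, a transformer encoder whose output sequence is summarized by an LSTM before classification, can be sketched as follows. This is a minimal illustrative sketch, not the authors' exact configuration: the channel count (6, assuming two 3-axis accelerometers as in HAR70+), window length, layer sizes, and class count are all assumptions.

```python
# Hypothetical transformer-encoder + LSTM model for wearable activity
# recognition. All hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn

class TransformerLSTMHAR(nn.Module):
    def __init__(self, n_channels=6, d_model=64, n_heads=4,
                 n_layers=2, lstm_hidden=64, n_classes=7):
        super().__init__()
        self.proj = nn.Linear(n_channels, d_model)   # embed sensor channels
        enc_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=n_layers)
        self.lstm = nn.LSTM(d_model, lstm_hidden, batch_first=True)
        self.head = nn.Linear(lstm_hidden, n_classes)

    def forward(self, x):             # x: (batch, time, channels)
        z = self.encoder(self.proj(x))
        _, (h_n, _) = self.lstm(z)    # final hidden state summarizes the window
        return self.head(h_n[-1])     # class logits

model = TransformerLSTMHAR()
logits = model(torch.randn(8, 100, 6))   # 8 windows of 100 sensor samples
print(logits.shape)                      # torch.Size([8, 7])
```

The intuition behind this ordering is that self-attention captures relationships across the whole window, while the LSTM condenses the attended sequence into a single state for classification.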
License
Copyright (c) 2024 International Journal of Computer Science and Information Technology

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.