Application of Time Series Model LT-MAE in EEG Emotion Recognition

Authors

  • Jianhao Ma

DOI:

https://doi.org/10.62051/ijcsit.v7n2.02

Keywords:

Emotion recognition, Multi-channel EEG signal, Self-supervised, Time series

Abstract

Electroencephalography (EEG) signals are non-linear and non-stationary. Traditionally, they are segmented into time windows for feature extraction under the assumption that the windows are independent and identically distributed, which ignores temporal connections and distribution discrepancies between windows. Additionally, generating high-quality annotations for dynamic emotions is labor-intensive and time-consuming. To address these issues, we propose LT-MAE, a self-supervised learning model. It segments EEG signals into continuous time steps and uses a long short-term memory (LSTM) network to learn context representations across channels. Emotion distributions are learned with an enhanced masked autoencoder trained on joint classification and reconstruction tasks. This approach assesses emotional changes over continuous time steps to determine long-term emotional inclinations. Experiments on the SEED-IV and DEAP datasets show that LT-MAE learns a broader time-step emotion distribution in the encoding space, improving emotion-detection accuracy and mitigating labeling inaccuracies caused by finite-time granularity. By capturing temporal structure with the LSTM and distribution discrepancies with the masked autoencoder, LT-MAE provides more accurate emotion recognition while reducing reliance on costly annotations. In conclusion, LT-MAE offers an effective solution to the limitations of conventional EEG feature extraction: through self-supervised learning, it enhances the understanding of temporal sentiment and improves emotion recognition accuracy, addressing challenges of annotation quality and granularity.
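The paper itself provides no code; as a rough illustration of the preprocessing the abstract describes, the sketch below segments a multi-channel EEG recording into consecutive time steps and applies the masked-autoencoder pretext corruption (zeroing a random subset of steps for the decoder to reconstruct). All function names, window sizes, and the mask ratio are our own assumptions, not details from the paper.

```python
import numpy as np

def segment_eeg(signal, win_len, hop):
    """Slice a (channels, samples) EEG recording into consecutive
    time-step windows of shape (steps, channels, win_len)."""
    n_ch, n_samp = signal.shape
    starts = range(0, n_samp - win_len + 1, hop)
    return np.stack([signal[:, s:s + win_len] for s in starts])

def mask_time_steps(steps, mask_ratio=0.5, seed=None):
    """MAE-style pretext corruption: zero out a random subset of time
    steps; an encoder would see the visible steps and a decoder would
    be trained to reconstruct the masked ones."""
    rng = np.random.default_rng(seed)
    n_steps = steps.shape[0]
    n_masked = int(round(n_steps * mask_ratio))
    masked_idx = rng.choice(n_steps, size=n_masked, replace=False)
    mask = np.zeros(n_steps, dtype=bool)
    mask[masked_idx] = True
    corrupted = steps.copy()
    corrupted[mask] = 0.0          # hide the selected time steps
    return corrupted, mask

# Hypothetical example: a 62-channel recording, 10 s at 200 Hz,
# cut into 1 s non-overlapping time steps.
eeg = np.random.randn(62, 2000)
steps = segment_eeg(eeg, win_len=200, hop=200)          # (10, 62, 200)
corrupted, mask = mask_time_steps(steps, mask_ratio=0.5, seed=0)
```

In the full model, the visible (unmasked) steps would be encoded by the LSTM to provide cross-channel context before the reconstruction and classification heads are applied; that part is omitted here.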


References

[1] Lin Shu, Jinyan Xie, Mingyue Yang, Ziyi Li, Zhenqi Li, Dan Liao, Xiangmin Xu, and Xinyi Yang. A review of emotion recognition using physiological signals. Sensors, 18(7):2074, 2018.

[2] Silvia De Nadai, Massimo D'Incà, Francesco Parodi, Mauro Benza, Anita Trotta, Enrico Zero, Luca Zero, and Roberto Sacile. Enhancing safety of transport by road by on-line monitoring of driver emotions. In 2016 11th System of Systems Engineering Conference (SoSE), pages 1--4, 2016.

[3] Rui Guo, Shuangjiang Li, Li He, Wei Gao, Hairong Qi, and Gina Owens. Pervasive and unobtrusive emotion sensing for human mental health. In 2013 7th International Conference on Pervasive Computing Technologies for Healthcare and Workshops, pages 436--439, 2013.

[4] Bruno Verschuere, Geert Crombez, Ernst Koster, and Katarzyna Uzieblo. Psychopathy and physiological detection of concealed information: A review. Psychologica Belgica, 46(1-2), 2006.

[5] Jing Cai, Ruolan Xiao, Wenjie Cui, Shang Zhang, and Guangda Liu. Application of electroencephalography-based machine learning in emotion recognition: A review. Frontiers in Systems Neuroscience, 15:729707, 2021.

[6] Shuaiqi Liu, Zeyao Wang, Yanling An, Jie Zhao, Yingying Zhao, and Yu-Dong Zhang. EEG emotion recognition based on the attention mechanism and pre-trained convolution capsule network. Knowledge-Based Systems, 265:110372, 2023.

[7] Liumei Zhang, Bowen Xia, Yichuan Wang, Wei Zhang, and Yu Han. A Fine-Grained Approach for EEG-Based Emotion Recognition Using Clustering and Hybrid Deep Neural Networks. Electronics, 12(23):4717, 2023.

[8] Ziyi Lv, Jing Zhang, and Estanislao Epota Oma. A Novel Method of Emotion Recognition from Multi-Band EEG Topology Maps Based on ERENet. Applied Sciences, 12(20):10273, 2022.

[9] Deng Pan, Haohao Zheng, Feifan Xu, Yu Ouyang, Zhe Jia, Chu Wang, and Hong Zeng. MSFR-GCN: A multi-scale feature reconstruction graph convolutional network for EEG emotion and cognition recognition. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2023.

[10] Yan Wu, Tianyu Meng, Qi Li, Yang Xi, and Hang Zhang. Study on multidimensional emotion recognition fusing dynamic brain network features in EEG signals. Biomedical Signal Processing and Control, 100:107054, 2025.

[11] Liwen Cao, Wenfeng Zhao, and Biao Sun. Emotion recognition using multi-scale EEG features through graph convolutional attention network. Neural Networks, 184:107060, 2025.

[12] Paul Ekman, Wallace V Friesen, Maureen O'Sullivan, Anthony Chan, Irene Diacoyanni-Tarlatzis, Karl Heider, Rainer Krause, William Ayhan LeCompte, Tom Pitcairn, Pio E Ricci-Bitti, et al. Universals and cultural differences in the judgments of facial expressions of emotion. Journal of Personality and Social Psychology, 53(4):712, 1987.

[13] Robert Plutchik. The nature of emotions: Human emotions have deep evolutionary roots, a fact that may explain their complexity and provide tools for clinical practice. American Scientist, 89(4):344--350, 2001.

[14] Peter J Lang. The emotion probe: Studies of motivation and attention. American Psychologist, 50(5):372, 1995.

[15] Albert Mehrabian. Comparison of the PAD and PANAS as models for describing emotions and for differentiating anxiety from depression. Journal of Psychopathology and Behavioral Assessment, 19:331--357, 1997.

Published

27-09-2025

Section

Articles

How to Cite

Ma, J. (2025). Application of Time Series Model LT-MAE in EEG Emotion Recognition. International Journal of Computer Science and Information Technology, 7(2), 18-30. https://doi.org/10.62051/ijcsit.v7n2.02