Research and Implementation of a Facial Emotion Recognition System Based on ZYNQ
DOI:
https://doi.org/10.62051/ijcsit.v3n2.31

Keywords:
Emotional recognition, ZYNQ, Face detection, Neural networks

Abstract
Emotion is the psychological and physiological response of a person or animal to a specific stimulus or event, and it influences cognition, decision-making, and behavior. Emotion recognition research has wide applications, from dialogue and communication with machines to the medical field. At present, most emotion recognition systems run on PC or embedded platforms. Compared with these traditional platforms, an FPGA offers strong parallel computing capability and can process multiple data streams simultaneously, making it well suited to the complex pattern recognition that emotion recognition requires. This article chooses ZYNQ as the development platform and, by exploiting the cooperation between its FPGA fabric and ARM processors, builds a complete hardware application system capable of hosting a neural network. To address the heavy computation and high storage-resource consumption of existing neural network computing frameworks, this paper adopts the lightweight network MobileNetV3 and implements and improves it. The model is encapsulated and ported using the Vivado HLS development tool and C++, and the resulting hardware IP core is invoked on the hardware platform to achieve ZYNQ-based facial emotion classification and recognition; the power consumption of the hardware resources is also analyzed.
License

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.