Preprint Article, Version 1 (this version is not peer-reviewed)

A Lightweight Shallow Convolutional SNN Combined with STDP Fine-Tuning for Facial Expression Recognition

Version 1: Received: 25 July 2024 / Approved: 26 July 2024 / Online: 26 July 2024 (13:22:17 CEST)

How to cite: Pu, G.; Chen, J.; Wang, R.; Tang, Z. A Lightweight Shallow Convolutional SNN Combined with STDP Fine-Tuning for Facial Expression Recognition. Preprints 2024, 2024072153. https://doi.org/10.20944/preprints202407.2153.v1

Abstract

Accurate and robust deep learning models for facial expression recognition are challenging to achieve given the diversity of human faces and the variability of images, including differences in facial pose and lighting conditions. In this work, we propose a clock-driven convolutional Spiking Neural Network (SNN) architecture with STDP fine-tuning, carefully calibrate its hyperparameters, and experiment with various optimization methods. The best resulting model was trained and evaluated on the FER2013 and FER+ databases, obtaining accuracies of 61.87% and 79.97%, respectively, without requiring auxiliary training data or face registration. To the best of our knowledge, the proposed SNN achieves accuracy comparable to CNNs of similar depth while offering the advantages of low energy consumption and high computational efficiency; its computational efficiency is approximately three times that of comparable CNNs. Along with this, we introduce the recent cumulative-spike-guided encoder visualization technique and use it to reveal the strong encoding capability of the proposed SNN.
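To make the abstract's description of a clock-driven convolutional SNN with STDP fine-tuning more concrete, the following is a minimal, illustrative sketch, not the authors' implementation: a convolutional leaky-integrate-and-fire (LIF) layer simulated over discrete time steps, followed by a simplified, rate-based STDP weight update. All layer sizes, time steps, thresholds, decay constants, and learning rates are assumptions for illustration only and do not come from the paper.

```python
# Minimal sketch (assumed values throughout; not the authors' code).
import torch
import torch.nn.functional as F

T = 20               # number of simulation time steps (clock-driven; assumed)
V_TH = 1.0           # firing threshold (assumed)
TAU = 2.0            # membrane decay constant (assumed)
A_PLUS, A_MINUS = 0.01, 0.012   # STDP potentiation / depression rates (assumed)


def lif_conv_forward(x_spikes, weight):
    """Run a convolutional LIF layer for T steps.

    x_spikes: binary spike tensor of shape (T, B, C_in, H, W)
    weight:   conv kernel of shape (C_out, C_in, 3, 3)
    Returns the output spike tensor of shape (T, B, C_out, H, W).
    """
    v = None
    out_spikes = []
    for t in range(T):
        i_t = F.conv2d(x_spikes[t], weight, padding=1)  # synaptic input at step t
        v = i_t if v is None else v / TAU + i_t          # leaky integration
        s_t = (v >= V_TH).float()                        # spike where threshold is crossed
        v = v * (1.0 - s_t)                              # hard reset of neurons that fired
        out_spikes.append(s_t)
    return torch.stack(out_spikes)


def stdp_update(weight, pre_spikes, post_spikes, lr=1e-3):
    """Simplified rate-based STDP: potentiate kernels whose pre- and post-synaptic
    cumulative activity co-occurs, with a weight-proportional depression term."""
    pre = pre_spikes.sum(0)                               # (B, C_in, H, W) cumulative pre spikes
    post = post_spikes.sum(0)                             # (B, C_out, H, W) cumulative post spikes
    cols = F.unfold(pre, kernel_size=3, padding=1)        # (B, C_in*9, H*W) input patches
    post_flat = post.flatten(2)                           # (B, C_out, H*W)
    corr = torch.bmm(post_flat, cols.transpose(1, 2)) / cols.shape[-1]  # average over positions
    dw = corr.mean(0).view_as(weight)                     # per-kernel pre/post correlation
    weight += lr * (A_PLUS * dw - A_MINUS * weight)       # Hebbian growth minus decay
    return weight.clamp_(0.0, 1.0)                        # keep weights bounded


if __name__ == "__main__":
    w = torch.rand(8, 1, 3, 3) * 0.5                       # 8 filters over grayscale input (assumed)
    # Poisson-like spike encoding of a batch of four 48x48 face images (FER2013-sized inputs).
    inp = (torch.rand(T, 4, 1, 48, 48) < 0.3).float()
    out = lif_conv_forward(inp, w)
    stdp_update(w, inp, out)
    print("mean output spike rate:", out.mean().item())
```

In a clock-driven simulation, every membrane potential is updated at every tick regardless of activity, which is also what makes cumulative spike counts of the kind used by the abstract's encoder visualization straightforward to collect.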

Keywords

convolutional spiking neural network; STDP fine-tuning; facial expression recognition; sparsity; computational efficiency

Subject

Computer Science and Mathematics, Computer Vision and Graphics
