Preprint Brief Report, Version 1 (not peer-reviewed)

Noise as an Optimization Tool for Spiking Neural Networks

Version 1: Received: 9 August 2024 / Approved: 9 August 2024 / Online: 9 August 2024 (10:28:12 CEST)

How to cite: Garipova, Y.; Yonekura, S.; Kuniyoshi, Y. Noise as an Optimization Tool for Spiking Neural Networks. Preprints 2024, 2024080704. https://doi.org/10.20944/preprints202408.0704.v1

Abstract

Standard artificial neural networks (ANNs) lack flexibility when handling corrupted input due to their fixed structure. Spiking Neural Networks (SNNs) can exploit biological temporal-coding features, such as noise-induced stochastic resonance and dynamical synapses, to improve a model's performance when its parameters are not optimized for a given input. Using the analog XOR task as a simplified analog of a Convolutional Neural Network, this paper demonstrates two key results: (1) SNNs solve a problem that is linearly inseparable for ANNs, with over 10% improvement in the optimal setting, and (2) in SNNs, synaptic noise and dynamical synapses compensate for non-optimal parameters, achieving near-optimal results.
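
The abstract's central mechanism, noise-induced stochastic resonance, can be illustrated with a minimal sketch (this is not the authors' model; all parameters and the single-neuron setup are illustrative assumptions): a leaky integrate-and-fire neuron driven by a subthreshold sinusoid never fires on its own, but adding Gaussian synaptic noise lets it spike, with moderate noise levels producing spikes that track the signal.

```python
# Minimal sketch of stochastic resonance in a leaky integrate-and-fire (LIF)
# neuron. NOT the authors' model: the neuron, input, and all constants below
# are illustrative assumptions, not values from the paper.
import numpy as np

def lif_spike_count(noise_std, T=2000, dt=1.0, tau=20.0,
                    v_th=1.0, v_reset=0.0, seed=0):
    """Count spikes of an LIF neuron driven by a subthreshold sine plus noise."""
    rng = np.random.default_rng(seed)
    t = np.arange(T) * dt
    # Sinusoidal drive at 80% of threshold: after the membrane's low-pass
    # filtering, it stays subthreshold, so the noiseless neuron never spikes.
    signal = 0.8 * v_th * np.sin(2 * np.pi * t / 200.0)
    v, spikes = 0.0, 0
    for i in range(T):
        noise = noise_std * rng.standard_normal()  # Gaussian synaptic noise
        v += dt / tau * (-v + signal[i] + noise)   # Euler step of LIF dynamics
        if v >= v_th:                              # threshold crossing = spike
            spikes += 1
            v = v_reset
    return spikes

# Sweep noise intensity: zero noise gives zero spikes; intermediate noise
# produces spikes clustered near the signal peaks; very strong noise drives
# firing regardless of the signal.
for sigma in [0.0, 0.5, 1.0, 2.0, 5.0]:
    print(f"noise_std={sigma:4.1f} -> spikes={lif_spike_count(sigma)}")
```

The spike count alone only shows that noise unlocks firing; demonstrating the classic inverted-U of stochastic resonance would additionally require a signal-locking measure such as spike-signal coherence, and the paper's setup presumably adds dynamical synapses and the analog XOR readout on top of mechanisms like this.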

Keywords

spiking neural networks; adaptability; noise

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
