Preprint
Article

Neural Network Under External Stimulus: Improving Storage Capacity and Reactions

A peer-reviewed article of this preprint also exists.

This version is not peer-reviewed

Submitted: 06 August 2020

Posted: 07 August 2020

Abstract
Motivated by the way animals react in nature, we introduce a fully-connected neural-network model that incorporates the external pattern presented to the system as a fundamental tool of the recognition process. In this new scenario, in the absence of this external field, memories are not attractors inside basins of attraction, as in usual attractor neural networks; instead, basins may be created according to the external pattern, thus allowing the storage of a much larger number of memories. The key point consists in calibrating the influence of the external pattern so as to cancel the noise generated by those memories not correlated with it. We illustrate how this proposal works by including this new contribution in the standard Hopfield model, showing a significant increase in its recognition capacity (typically by a factor of $10^{2}$). As an additional feature of this model, we show its ability to react promptly to changes in the external environment, a crucial attribute of living beings. This procedure can be applied to a wide variety of neural-network models to considerably increase their recognition and reaction capabilities.
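To make the mechanism described above concrete, the following minimal sketch (not the authors' code) implements a standard Hopfield network whose local field is augmented by an external-stimulus term. The coupling strength h_ext, the network size, and the number of stored memories are assumptions chosen only for illustration; the paper calibrates the external-field strength to cancel the noise from uncorrelated memories.

```python
# Minimal sketch: Hopfield dynamics with an external-pattern bias field.
# All numerical values (N, P, h_ext, noise level) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

N = 500        # number of neurons (assumed)
P = 50         # number of stored memories (assumed)
h_ext = 1.0    # strength of the external-pattern field (assumed)

# Random binary (+/-1) memories and standard Hebbian couplings
memories = rng.choice([-1, 1], size=(P, N))
J = memories.T @ memories / N
np.fill_diagonal(J, 0.0)

def recall(external_pattern, steps=20):
    """Asynchronous dynamics with the external pattern acting as a bias field."""
    s = rng.choice([-1, 1], size=N)          # random initial state
    for _ in range(steps):
        for i in rng.permutation(N):
            local_field = J[i] @ s + h_ext * external_pattern[i]
            s[i] = 1 if local_field >= 0 else -1
    return s

# Present a noisy version of memory 0 as the external stimulus
target = memories[0]
stimulus = target.copy()
flip = rng.choice(N, size=N // 10, replace=False)   # corrupt 10% of the bits
stimulus[flip] *= -1

final_state = recall(stimulus)
overlap = final_state @ target / N
print(f"overlap with the presented memory: {overlap:.3f}")
```

An overlap close to 1 indicates that the network state aligned with the memory selected by the external stimulus; in a plain Hopfield network (h_ext = 0) with P well above the usual capacity limit, retrieval would instead fail.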
Keywords: 
Subject: Physical Sciences - Applied Physics
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.