Preprint
Article

A Two-Step Rule for Backpropagation

A peer-reviewed article of this preprint also exists.

This version is not peer-reviewed

Submitted: 08 March 2023
Posted: 09 March 2023

Abstract
We present a simplified computational rule for the back-propagation formulas for artificial neural networks. In this work, we provide a generic two-step rule for the back-propagation algorithm in matrix notation, one that incorporates both the forward and backward phases of the computations involved in the learning process. Specifically, this recursive rule efficiently propagates the changes to all synaptic weights in the network, layer by layer. In particular, we use this rule to compute both the up and down partial derivatives of the cost function with respect to all the connections feeding into the output layer.
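The preprint's specific two-step formulation is not reproduced on this page. Purely as a point of reference for the matrix notation the abstract mentions, below is a minimal sketch of the standard forward and backward phases for a two-layer network, assuming a quadratic cost and sigmoid activations; all variable names and shapes are hypothetical and this should not be read as the authors' exact rule.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)

# Hypothetical sizes: one hidden layer; column-vector conventions.
rng = np.random.default_rng(0)
n_in, n_hid, n_out = 4, 5, 3
W1 = rng.standard_normal((n_hid, n_in))
b1 = np.zeros((n_hid, 1))
W2 = rng.standard_normal((n_out, n_hid))
b2 = np.zeros((n_out, 1))

x = rng.standard_normal((n_in, 1))   # input
y = rng.standard_normal((n_out, 1))  # target

# Forward phase: cache pre-activations z and activations a, layer by layer.
z1 = W1 @ x + b1
a1 = sigmoid(z1)
z2 = W2 @ a1 + b2
a2 = sigmoid(z2)

# Backward phase: error at the output layer, then propagate it back
# through the transposed weight matrices, layer by layer.
delta2 = (a2 - y) * sigmoid_prime(z2)          # dC/dz2 for quadratic cost
delta1 = (W2.T @ delta2) * sigmoid_prime(z1)   # dC/dz1

# Partial derivatives of the cost with respect to weights and biases.
dW2 = delta2 @ a1.T
db2 = delta2
dW1 = delta1 @ x.T
db1 = delta1
```

Under these assumptions, the gradient for every connection feeding into a layer is obtained from one outer product per layer, which is the kind of recursive, layer-by-layer computation the abstract describes.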
Keywords: 
Subject: Computer Science and Mathematics - Computational Mathematics
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits the free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.