This work describes the backpropagation algorithm and compares it with the Extended Kalman filter, a second-order training method that can be applied to the problem of learning neural network parameters and is known to converge in only a few iterations. The algorithms are compared with respect to their effectiveness and speed of convergence using simulated data for both a regression and a classification task.
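As a rough illustration of the Extended Kalman filter approach mentioned above (not the paper's implementation; the network size, the random-walk state model, the noise covariances Q and R, and the numerical Jacobian are arbitrary choices made only for this sketch), the following Python snippet trains a small one-hidden-layer regression network by treating the weight vector as the EKF state and the network output as the measurement:

```python
# Minimal EKF-training sketch: weights are the state (random walk),
# the scalar network output is the measurement. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden = 1, 5
n_w = n_hidden * (n_in + 1) + (n_hidden + 1)  # total number of weights


def unpack(w):
    """Split the flat weight vector into layer parameters."""
    W1 = w[:n_hidden * n_in].reshape(n_hidden, n_in)
    b1 = w[n_hidden * n_in:n_hidden * n_in + n_hidden]
    W2 = w[n_hidden * n_in + n_hidden:n_hidden * n_in + 2 * n_hidden]
    b2 = w[-1]
    return W1, b1, W2, b2


def forward(w, x):
    """Scalar output of a one-hidden-layer tanh network."""
    W1, b1, W2, b2 = unpack(w)
    return W2 @ np.tanh(W1 @ x + b1) + b2


def jacobian(w, x, eps=1e-6):
    """Numerical Jacobian (1 x n_w) of the output w.r.t. the weights."""
    H = np.zeros((1, n_w))
    for j in range(n_w):
        dw = np.zeros(n_w)
        dw[j] = eps
        H[0, j] = (forward(w + dw, x) - forward(w - dw, x)) / (2 * eps)
    return H


# Simulated regression data: y = sin(x) + noise.
X = rng.uniform(-3, 3, size=(200, n_in))
Y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)

# EKF state: weight estimate w with covariance P.
w = 0.1 * rng.standard_normal(n_w)
P = np.eye(n_w)
Q = 1e-5 * np.eye(n_w)  # process noise (random-walk weights)
R = np.array([[0.01]])  # measurement noise

for x, y in zip(X, Y):
    P = P + Q                                  # predict step
    H = jacobian(w, x)                         # linearized measurement
    S = H @ P @ H.T + R                        # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
    w = w + (K * (y - forward(w, x))).ravel()  # weight update
    P = (np.eye(n_w) - K @ H) @ P              # covariance update

print("training MSE:",
      np.mean([(forward(w, x) - y) ** 2 for x, y in zip(X, Y)]))
```

Each training pattern is processed as one measurement update, which is why the filter can reduce the error substantially within a single pass over the data; a gradient-descent backpropagation loop would instead apply many small weight corrections over repeated epochs.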
Subject: Computer Science and Mathematics - Data Structures, Algorithms and Complexity
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits the free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.