Article
Version 1
Preserved in Portico. This version is not peer-reviewed.
Intriguing Self-optimization: Gradient Sign and Fractional Order Gradient
Received: 19 May 2023 / Approved: 23 May 2023 / Online: 23 May 2023 (08:11:44 CEST)
How to cite: Tan, S.; Pu, Y. Intriguing Self-optimization: Gradient Sign and Fractional Order Gradient. Preprints 2023, 2023051614. https://doi.org/10.20944/preprints202305.1614.v1
Abstract
The gradient descent algorithm has become the standard method for computing the extreme values of functions, but for multivariate functions it is often ineffective, because the individual elements typically converge at inconsistent rates. In this paper, we show that the gradient sign is a self-optimizing operator: it ensures that the convergence rate is consistent across all elements. This also explains, from an optimization perspective, the success of the Fast Gradient Sign Method (FGSM) in generating adversarial samples that are indistinguishable from normal inputs yet easily fool neural networks. We further show that the fractional order gradient is likewise self-optimizing, and that the convergence speed of the resulting algorithm can be controlled by adjusting the order of the gradient. Experiments suggest that this algorithm not only generates adversarial samples faster than existing algorithms, but also that a single source image can yield many such samples. The algorithm is also more effective than others at generating adversarial samples from simple images.
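The self-optimizing property of the gradient sign described in the abstract can be illustrated with a minimal sketch (this is an illustrative toy, not the paper's experiments): on an ill-conditioned quadratic f(x) = Σ aᵢxᵢ², plain gradient descent shrinks each coordinate at a rate proportional to aᵢ, while the sign-of-gradient update moves every coordinate by the same step size, so all elements converge at a consistent rate.

```python
import numpy as np

# Toy objective f(x) = sum(a_i * x_i**2) with very different curvatures
# per element; gradient is 2 * a * x.
a = np.array([100.0, 1.0, 0.01])
x_gd = np.array([1.0, 1.0, 1.0])    # plain gradient descent iterate
x_sign = np.array([1.0, 1.0, 1.0])  # gradient-sign iterate
lr = 0.005

for _ in range(200):
    x_gd = x_gd - lr * (2 * a * x_gd)          # step scales with a_i
    x_sign = x_sign - lr * np.sign(2 * a * x_sign)  # same step for all elements

print("plain GD :", x_gd)    # coordinate with a=0.01 has barely moved
print("sign GD  :", x_sign)  # all coordinates reduced at the same rate
```

After 200 steps, the slow coordinate under plain gradient descent is still close to its starting value, while every coordinate under the sign update has been driven near zero, matching the abstract's claim of consistent per-element convergence.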
Keywords
Caputo fractional derivative; fractional order gradient; gradient sign; adversarial samples; DNNs
Subject
Computer Science and Mathematics, Artificial Intelligence and Machine Learning
Copyright: This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.