Article
Version 1
This version is not peer-reviewed
Transforming Neural Networks on Manifold and Renormalization
: Received: 12 July 2024 / Approved: 12 July 2024 / Online: 15 July 2024 (07:18:15 CEST)
How to cite: Garayev, G. Transforming Neural Networks on Manifold and Renormalization. Preprints 2024, 2024071090. https://doi.org/10.20944/preprints202407.1090.v1
Abstract
This paper presents an enhanced Fully Connected Neural Network (FCNN) framework that incorporates manifold learning and Renormalization Group (RG) methods. By integrating Christoffel symbols, the model adapts to the curvature and topology of the data manifold, improving learning from geometrically complex data. In addition, RG methods enable effective multi-scale data processing. We demonstrate the enhanced model’s performance on synthetic and real-world datasets, showing improved accuracy over standard FCNNs, particularly in the simulation of complex systems.
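The geometric ingredient named in the abstract can be illustrated with a minimal numerical sketch. The paper's actual architecture is not reproduced here; this only shows the standard computation the abstract refers to, the Christoffel symbols of the second kind derived from a metric tensor by central finite differences. The `metric_fn` interface is an assumption for illustration, not the author's API.

```python
import numpy as np

def christoffel(metric_fn, x, eps=1e-5):
    """Christoffel symbols of the second kind at point x.

    metric_fn: maps a point x of shape (n,) to a symmetric metric
               tensor g of shape (n, n).
    Returns gamma with gamma[k, i, j] = Gamma^k_{ij}.
    """
    n = x.shape[0]
    g_inv = np.linalg.inv(metric_fn(x))

    # dg[l, i, j] = d g_{ij} / d x^l, via central differences.
    dg = np.zeros((n, n, n))
    for l in range(n):
        dx = np.zeros(n)
        dx[l] = eps
        dg[l] = (metric_fn(x + dx) - metric_fn(x - dx)) / (2 * eps)

    # Gamma^k_{ij} = 1/2 g^{kl} (d_i g_{jl} + d_j g_{il} - d_l g_{ij})
    term = dg + dg.transpose(1, 0, 2) - dg.transpose(1, 2, 0)
    return 0.5 * np.einsum('kl,ijl->kij', g_inv, term)

# Sanity check on the flat polar-coordinate metric g = diag(1, r^2),
# where the nonzero symbols are Gamma^r_{tt} = -r and Gamma^t_{rt} = 1/r.
gamma = christoffel(lambda p: np.diag([1.0, p[0] ** 2]), np.array([2.0, 0.5]))
print(gamma[0, 1, 1])  # approx -2.0 (= -r at r = 2)
print(gamma[1, 0, 1])  # approx 0.5  (= 1/r at r = 2)
```

A manifold-aware layer could consume such symbols, e.g. to correct gradient directions for curvature, but how they enter the FCNN's forward or backward pass is specific to the paper's method.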
Keywords
differential geometry; multi-scale representation; AI; loss functions
Subject
Computer Science and Mathematics, Artificial Intelligence and Machine Learning
Copyright: This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.