Preprint Article, Version 1. This version is not peer-reviewed.

Transforming Neural Networks on Manifold and Renormalization

Version 1 : Received: 12 July 2024 / Approved: 12 July 2024 / Online: 15 July 2024 (07:18:15 CEST)

How to cite: Garayev, G. Transforming Neural Networks on Manifold and Renormalization. Preprints 2024, 2024071090. https://doi.org/10.20944/preprints202407.1090.v1

Abstract

This paper presents an enhanced Fully Connected Neural Network (FCNN) framework that incorporates manifold learning and Renormalization Group (RG) methods. By integrating Christoffel symbols, the model adapts to the curvature and topology of the data manifold, improving learning from geometrically complex data. Additionally, RG methods enable effective multi-scale data processing. We demonstrate the enhanced model’s performance on synthetic and real-world datasets, showing improved accuracy compared to standard FCNNs, particularly in complex systems simulation.
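The abstract describes two ingredients: a geometric correction of activations via Christoffel symbols, and an RG-style coarse-graining for multi-scale processing. As a rough illustration of what such a combination could look like, the sketch below applies a second-order geodesic correction \(x^k - \tfrac{1}{2}\Gamma^k_{ij} x^i x^j\) before a dense layer, followed by a block-averaging coarse-grain step. All names (`GeodesicDense`, `coarse_grain`) and the random initialisation of the symbols are illustrative assumptions; the paper's actual architecture and how it estimates the Christoffel symbols from data are not reproduced here.

```python
import numpy as np

class GeodesicDense:
    """Hypothetical dense layer with a Christoffel-symbol correction.

    The pre-activation is shifted by the leading geodesic term
    x_k - 0.5 * Gamma^k_ij x_i x_j, so the forward pass accounts for
    manifold curvature in local coordinates (sketch, not the paper's model).
    """

    def __init__(self, in_dim, out_dim, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0.0, 0.1, (in_dim, out_dim))
        self.b = np.zeros(out_dim)
        # Christoffel symbols Gamma[k, i, j]; randomly initialised here,
        # whereas in practice they would be derived from a learned metric.
        self.gamma = rng.normal(0.0, 0.01, (in_dim, in_dim, in_dim))

    def forward(self, x):
        # Second-order geometric correction of the input coordinates.
        correction = 0.5 * np.einsum('kij,bi,bj->bk', self.gamma, x, x)
        x_geo = x - correction
        return np.tanh(x_geo @ self.W + self.b)

def coarse_grain(x):
    """Toy RG step: average adjacent feature pairs (block-spin style),
    halving the feature dimension to expose the next coarser scale."""
    b, d = x.shape
    return x.reshape(b, d // 2, 2).mean(axis=-1)

layer = GeodesicDense(in_dim=4, out_dim=4)
out = layer.forward(np.ones((2, 4)))     # shape (2, 4), values in (-1, 1)
pooled = coarse_grain(out)               # shape (2, 2)
```

Stacking `coarse_grain` between such layers would give each successive layer a coarser view of the data, which is one plausible reading of the multi-scale processing the abstract claims.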

Keywords

differential geometry; multi-scale representation; AI; loss functions

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
