Preprint Article, Version 1 (this version is not peer-reviewed)

Power-Based Normalization of Loss Terms to Improve the Performance of Physics-Informed Neural Networks (PINNs)

Version 1 : Received: 7 August 2024 / Approved: 7 August 2024 / Online: 7 August 2024 (13:34:45 CEST)

How to cite: Tongne, A. Power-Based Normalization of Loss Terms to Improve the Performance of Physics-Informed Neural Networks (PINNs). Preprints 2024, 2024080528. https://doi.org/10.20944/preprints202408.0528.v1

Abstract

A novel approach is developed to improve the convergence of Physics-Informed Neural Networks (PINNs), with the aim of employing them as real-time computational models within the digital-twin framework for manufacturing processes. The method weights the physical equations, boundary conditions, and initial conditions so that their contributions have comparable magnitudes, with power chosen as the normalizing quantity in this study. The approach is applied to thermal problems, which are crucial for predicting defects in manufactured parts. Several configurations, including complex boundary conditions and complex physics, were tested to assess the model's robustness. The resulting weighted PINN (W-PINN) yields accurate predictions and strong stability compared to the classical PINN.
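The core idea of the abstract — rescaling the PDE, boundary-condition, and initial-condition losses so they contribute at comparable magnitudes — can be illustrated with a short sketch. Note the assumptions: the paper normalizes by a physical power, whose exact formula is not given here, so this sketch balances the raw loss magnitudes instead; the function name `balance_loss_weights` and the sample loss values are hypothetical, not from the paper.

```python
import numpy as np

def balance_loss_weights(loss_terms):
    """Return one weight per loss term so that all weighted terms share a
    common magnitude (the mean of the raw magnitudes).

    Illustrative stand-in for the paper's power-based normalization: the
    reference quantity here is the loss magnitude itself, not a physical
    power (an assumption for this sketch).
    """
    mags = np.abs(np.asarray(loss_terms, dtype=float))
    target = mags.mean()
    # Guard against division by zero for terms that are already zero.
    return np.where(mags > 0, target / np.maximum(mags, 1e-12), 1.0)

# Hypothetical raw losses: PDE residual, boundary condition, initial condition.
raw = [1e-2, 5.0, 3e-4]
w = balance_loss_weights(raw)
weighted = [wi * ti for wi, ti in zip(w, raw)]
total_loss = sum(weighted)  # each term now contributes equally
```

In a training loop, the weights would typically be recomputed periodically from the current loss values so no single term (here, the boundary-condition loss at 5.0) dominates the gradient.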

Keywords

Physics Informed; Neural Networks; Heat transfer; PINN; Advection

Subject

Engineering, Mechanical Engineering


