Sequencing Initial Conditions in Physics-Informed Neural Networks
DOI:
https://doi.org/10.56946/jce.v3i1.345
Keywords:
Scientific machine learning, PINN, soft-regularization, multiphysics modeling, chemical engineering PDEs
Abstract
The scientific machine learning (SciML) field has introduced a new class of models called physics-informed neural networks (PINNs). These models incorporate domain-specific knowledge as soft constraints on a loss function and use machine learning techniques to train the model. Although PINN models have shown promising results on simple problems, they are prone to failure when a moderate level of complexity is added to the problem. We demonstrate that existing baseline models, in particular PINN and evolutionary sampling (Evo), are unable to capture the solution of differential equations with convection, reaction, and diffusion operators when the imposed initial condition is non-trivial. We then propose a promising approach to address these failure modes: coupling curriculum learning with the baseline models, where the network first trains on PDEs with simple initial conditions and is progressively exposed to more complex initial conditions. Our results show that the proposed method reduces the error by one to two orders of magnitude compared to regular PINN and Evo.
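To make the training strategy described in the abstract concrete, the sketch below shows a minimal curriculum-style PINN for a 1D convection equation, where the initial condition u(x, 0) = sin(kx) is made progressively harder by increasing k. This is an illustrative example only, not the authors' implementation: the equation, convection speed, network size, collocation counts, frequency schedule, and optimizer settings are all assumptions, and boundary terms are omitted for brevity.

```python
# Minimal sketch (assumed setup, not the paper's code): a PINN for the 1D
# convection equation u_t + beta * u_x = 0 on x in [0, 2*pi], t in [0, 1],
# trained with a curriculum over initial-condition frequencies u(x, 0) = sin(k*x).
import math
import torch
import torch.nn as nn

torch.manual_seed(0)
beta = 5.0                 # convection speed (illustrative)
freq_schedule = [1, 2, 4]  # curriculum: simple -> complex initial conditions


class MLP(nn.Module):
    def __init__(self, width=64, depth=4):
        super().__init__()
        layers, d_in = [], 2
        for _ in range(depth):
            layers += [nn.Linear(d_in, width), nn.Tanh()]
            d_in = width
        layers.append(nn.Linear(d_in, 1))
        self.net = nn.Sequential(*layers)

    def forward(self, x, t):
        return self.net(torch.cat([x, t], dim=1))


def pinn_loss(model, k):
    # Soft PDE constraint at random collocation points in the interior.
    x = torch.rand(2000, 1) * 2 * math.pi
    t = torch.rand(2000, 1)
    x.requires_grad_(True)
    t.requires_grad_(True)
    u = model(x, t)
    u_x = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    u_t = torch.autograd.grad(u, t, torch.ones_like(u), create_graph=True)[0]
    pde = ((u_t + beta * u_x) ** 2).mean()

    # Soft initial-condition constraint; k is the curriculum variable.
    x0 = torch.rand(500, 1) * 2 * math.pi
    ic = ((model(x0, torch.zeros_like(x0)) - torch.sin(k * x0)) ** 2).mean()
    return pde + ic  # boundary terms omitted for brevity


model = MLP()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for k in freq_schedule:  # train on progressively harder initial conditions
    for step in range(5000):
        opt.zero_grad()
        loss = pinn_loss(model, float(k))
        loss.backward()
        opt.step()
    print(f"curriculum stage k={k}: loss={loss.item():.3e}")
```

In this sketch, the non-curriculum baseline corresponds to skipping the schedule and training directly on the hardest initial condition from scratch.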
License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.