Adding Regularization to the PINN Loss function #102
Comments
👋🏻 @yorkiva Thank you for your comment. Eventually, I agree with you that a class
Hello @yorkiva :) Just wanted to let you know that in the beta version we will soon release the possibility to add a custom loss for the PINN (#105), along with other very useful features such as gradient clipping and batch gradient accumulation, since we will use Lightning. Thank you for the very useful feedback 😄
Has this been resolved?
The regularizer in the PINN class implements only L2 regularization. It should also be extended to incorporate L1 regularization. I have some experience comparing these regularizations, and I have found that for some problems L1 regularization can be quite useful when the training data (i.e. boundary/initial conditions) is noisy.
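For illustration, here is a minimal PyTorch sketch of adding L1 and L2 weight penalties to a training loss. This is not the library's actual API, and `regularized_loss` and its coefficients are hypothetical names chosen for this example:

```python
import torch
import torch.nn as nn

# A small network standing in for a PINN model.
model = nn.Sequential(nn.Linear(1, 16), nn.Tanh(), nn.Linear(16, 1))

def regularized_loss(data_loss: torch.Tensor,
                     model: nn.Module,
                     l1: float = 0.0,
                     l2: float = 0.0) -> torch.Tensor:
    """Add L1 and/or L2 penalties on the model weights to a base loss."""
    l1_term = sum(p.abs().sum() for p in model.parameters())
    l2_term = sum(p.pow(2).sum() for p in model.parameters())
    return data_loss + l1 * l1_term + l2 * l2_term

# Toy data loss (stand-in for a boundary/initial-condition term).
x = torch.linspace(0, 1, 32).unsqueeze(-1)
y = torch.sin(x)
data_loss = nn.functional.mse_loss(model(x), y)

loss = regularized_loss(data_loss, model, l1=1e-4, l2=1e-4)
loss.backward()  # gradients now include both regularization terms
```

The L1 term encourages sparse weights, which is the property that tends to help when the supervised (boundary/initial) data is noisy; the L2 term is the standard weight-decay penalty.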
Is your feature request related to a problem? Please describe.
The condition module is the only place where you can define your physics-informed loss functions. However, there seems to be no way of adding additional regularizers, such as an L1 or L2 penalty on the model's weights. Such regularization can be very useful to constrain overfitting, especially when training with noisy initial conditions.
Describe the solution you'd like
It would be good to have an extension of the condition module (or something similar) to allow regularization.
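One possible shape for such an extension is a condition-like object that contributes a weight penalty to the total loss. The `WeightRegularizer` class below is purely hypothetical, a sketch of the idea rather than anything in the library:

```python
import torch
import torch.nn as nn

class WeightRegularizer:
    """Hypothetical condition-style term penalizing model weights (L1 or L2)."""
    def __init__(self, weight: float = 1e-4, order: int = 1):
        assert order in (1, 2), "order=1 for L1, order=2 for L2"
        self.weight = weight
        self.order = order

    def loss(self, model: nn.Module) -> torch.Tensor:
        # Sum |w|^order over all trainable parameters.
        penalty = sum(p.abs().pow(self.order).sum() for p in model.parameters())
        return self.weight * penalty

model = nn.Linear(2, 1)
reg = WeightRegularizer(weight=1e-3, order=1)  # L1 penalty

# Stand-in for the physics-informed residual loss from the conditions.
physics_loss = torch.randn(10, 1).pow(2).mean()
total_loss = physics_loss + reg.loss(model)
```

Treating the regularizer as just another loss term means it could be summed with the existing condition losses without changing how those conditions are defined.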
Describe alternatives you've considered
Additional context
openjournals/joss-reviews#5352