
Batching #33

Closed
ndem0 opened this issue Nov 22, 2022 · 0 comments · Fixed by #51
Labels
enhancement New feature or request

Comments


ndem0 commented Nov 22, 2022

Is your feature request related to a problem? Please describe.
Add the possibility of using batching within PINN training.

Describe the solution you'd like
A new constructor argument to select the batch size, to be used in the optimization cycle.

Additional context
Mandatory for long training runs, especially on GPU.
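A minimal sketch of what the requested feature could look like: the sample points are split into mini-batches and the optimization step is applied per batch instead of on the full set. All names here (`make_batches`, `train`, `step`) are illustrative assumptions, not the actual PINA API introduced in #51.

```python
import random

def make_batches(points, batch_size, shuffle=True, seed=0):
    """Split a list of sample points into mini-batches of at most batch_size."""
    idx = list(range(len(points)))
    if shuffle:
        # Deterministic shuffle so runs are reproducible (illustrative choice)
        random.Random(seed).shuffle(idx)
    return [
        [points[i] for i in idx[start:start + batch_size]]
        for start in range(0, len(idx), batch_size)
    ]

def train(points, batch_size, epochs, step):
    """Hypothetical optimization cycle: one gradient step per mini-batch."""
    for epoch in range(epochs):
        # Reshuffle every epoch so batches differ between epochs
        for batch in make_batches(points, batch_size, seed=epoch):
            step(batch)  # e.g. compute the PDE residual loss on the batch and backprop
```

With `batch_size` exposed in the solver constructor, each epoch touches every sample point exactly once while keeping per-step memory bounded, which is the point of the "mandatory for long training on GPU" remark above.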

@ndem0 ndem0 added the enhancement New feature or request label Nov 22, 2022
@ndem0 ndem0 assigned ndem0 and unassigned ndem0 Nov 29, 2022
@ndem0 ndem0 linked a pull request Dec 6, 2022 that will close this issue
@ndem0 ndem0 closed this as completed in #51 Dec 12, 2022