
Distributed computing support (hundreds of processors) #18

Open
ali-ramadhan opened this issue Mar 9, 2019 · 1 comment
ali-ramadhan commented Mar 9, 2019

Right now joblib does just fine for multi-core parallelization, but for distributed computing it might be good to check out Dask: https://docs.dask.org/en/latest/

We'll probably need this if we want to do >10,000,000 particles.
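A minimal sketch (not project code) of what farming per-particle work out to a Dask cluster might look like in place of joblib. `advect_particle` and the particle collection are hypothetical placeholders for whatever per-particle work we currently do:

```python
import dask.bag as db
from dask.distributed import Client

def advect_particle(particle):
    # Placeholder for the real per-particle computation.
    return particle

if __name__ == "__main__":
    # Connect to a running Dask scheduler on the cluster; with no address
    # given, Client() spins up a local cluster instead.
    client = Client()

    particles = range(10_000_000)  # stand-in for the real particle set
    bag = db.from_sequence(particles, npartitions=1_000)
    results = bag.map(advect_particle).compute()
```

The same `dask.bag` code runs unchanged on a laptop or on hundreds of processors; only the `Client` address changes.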

@ali-ramadhan ali-ramadhan self-assigned this Mar 9, 2019
ali-ramadhan commented

We'll also need something like Dask once the datasets can't fit in memory anymore. It would be easy for us to split the datasets ourselves, but it might be nicer to switch to Dask?
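A minimal sketch (not project code) of the out-of-core case: wrapping an on-disk dataset as a chunked `dask.array` so reductions stream through memory one chunk at a time. The file name and dataset key are made up for illustration:

```python
import h5py
import dask.array as da

f = h5py.File("velocity.hdf5", "r")               # hypothetical output file
u = da.from_array(f["u"], chunks=(1, 256, 256))   # lazy, chunked view of the data
mean_u = u.mean().compute()                       # only one chunk in memory at a time
```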
