
Multithreading in Emulation/MCMC #200

Open
odunbar opened this issue Jan 18, 2023 · 2 comments
Labels: enhancement (New feature or request)

@odunbar (Collaborator)

odunbar commented Jan 18, 2023

Issue

There are easy speed gains to be had in MCMC by using multithreading within each step (launching with e.g. julia --project -t 8 script.jl). For the GP (and scalar RF) implementations,

  • the prediction stage runs a loop over the scalar-valued models,
  • the training stage also runs a loop over the scalar-valued models (here it may require extra memory management).

Suggestion

  1. For MCMC, add the Threads.@threads macro to the prediction loop:

    Threads.@threads for i in 1:M
        μ[i, :], σ2[i, :] = predict_method(gp.models[i], new_inputs)
    end

    This should speed up prediction within MCMC by up to the number of threads (e.g. 8x with 8 threads).

  2. For decorrelated problems (i.e. GP and scalar RF), one can similarly decorate the training loop over scalar-valued models.
    This should likewise speed up training by up to e.g. 8x.

@odunbar odunbar added the enhancement New feature or request label Jan 18, 2023
@odunbar (Collaborator, Author)

odunbar commented Jan 23, 2023

Preliminary results from @szy21 show that 8 threads give only a ~2x speed-up to sampling in the EDMF example; I'll continue the investigation with other examples.

@odunbar (Collaborator, Author)

odunbar commented Apr 4, 2023

Oftentimes the downstream dependencies will greedily harness all available threads, so calling with -t 8 and making no code changes (i.e. not adding Threads.@threads) often already gives a significant speedup.
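One way to see this effect is to inspect the thread pools directly. The sketch below assumes the greedy downstream dependency is BLAS (the usual case for GP linear algebra in Julia); BLAS manages its own thread count independently of the -t flag, which is one reason adding Threads.@threads on top can fail to help, or can oversubscribe cores.

```julia
using LinearAlgebra

# Julia's own worker threads, set by `julia -t 8` (or JULIA_NUM_THREADS):
println("Julia threads: ", Threads.nthreads())

# BLAS keeps a separate thread pool, sized independently of -t:
println("BLAS threads:  ", BLAS.get_num_threads())

# If Julia-level @threads loops and multithreaded BLAS compete for cores,
# a standard mitigation is to pin BLAS to one thread around those loops:
BLAS.set_num_threads(1)
```

This would also explain the ~2x observation above: if BLAS was already saturating the machine, the extra Julia threads have little left to gain.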
