
Adaptive proposals in Turing composite Gibbs sampler #44

Closed
arzwa opened this issue Dec 8, 2020 · 8 comments


arzwa commented Dec 8, 2020

Hi there, a question related to this PR.

I'm currently facing an application where I would really like to use adaptive proposals like those defined in that PR in a Metropolis-within-Gibbs setting: we have a parameter vector x, each parameter has its own adaptive univariate proposal, and in each iteration of the MCMC sampler we update each component of the parameter vector conditional on the others using a Metropolis-Hastings step. The Turing way to go would seem to be to use the machinery implemented in AdvancedMH inside a Turing composite Gibbs sampler, something roughly like Gibbs(AdaptiveMH(:p1), AdaptiveMH(:p2), ...), where p1, p2, ... are the components of the parameter vector? I think this is worthwhile in general for low-dimensional applications where the gradient of the log-likelihood is very costly or unavailable. I wonder what would be the best way to proceed to allow this? Thanks for any hints!
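For concreteness, the component-wise scheme I have in mind could be sketched roughly as below (a self-contained toy with hypothetical names, not AdvancedMH's actual API): each coordinate gets its own random-walk proposal scale, adapted toward a target acceptance rate with a Robbins-Monro step.

```julia
# Toy adaptive Metropolis-within-Gibbs: one adaptive univariate
# random-walk proposal per component of the parameter vector.
function adaptive_mwg(logpost, x0; iters=2000, target=0.44)
    x = copy(x0)
    d = length(x)
    logσ = zeros(d)                       # per-component log proposal scales
    chain = Matrix{Float64}(undef, iters, d)
    lp = logpost(x)
    for t in 1:iters
        for i in 1:d
            xprop = copy(x)
            xprop[i] += exp(logσ[i]) * randn()   # perturb component i only
            lpprop = logpost(xprop)
            accept = log(rand()) < lpprop - lp   # Metropolis-Hastings step
            if accept
                x, lp = xprop, lpprop
            end
            # Robbins-Monro adaptation toward the target acceptance rate
            logσ[i] += (accept - target) / sqrt(t)
        end
        chain[t, :] = x
    end
    return chain
end

# usage: sample a standard bivariate normal from a poor starting point
chain = adaptive_mwg(x -> -0.5 * sum(abs2, x), [5.0, -5.0])
```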


cpfiffer commented Dec 8, 2020

I've got a couple of thoughts here:

  1. It's not clear to me that any Gibbs stuff should live in AdvancedMH -- it's kind of a scope overreach. That said, we do have emcee in here, which isn't quite MH but does use an MH step, as many algorithms do. I would imagine that a better solution is AdvancedGibbs or something, which would be Turing-free Gibbs sampling using the AbstractMCMC interface.
  2. Is there a reason you can't use Turing? Do you already have the likelihood function written up (a common case, just asking)?

If we were to support it, I don't think it would be easy, since you'd have to be a little more specific about how all the proposals are generated. The issue is that AdvancedMH supports all kinds of proposal styles, and it doesn't actually know what counts as a "parameter" or have any notion of updating individual parameters. For example, if all you're doing is sampling an n-vector with a multivariate normal proposal, you know you can just go through each element. It's far less clear how to do this with NamedTuple proposals (for instance), but I suspect there might be an easy recursive solution here.
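The recursive idea might look something like this (purely illustrative; `leaves` is a hypothetical helper, not part of AdvancedMH): flatten an arbitrarily nested NamedTuple of parameters into (path, value) leaves so a Gibbs-style loop can visit one scalar "parameter" at a time.

```julia
# Purely illustrative: recursively flatten a NamedTuple of parameter
# values into (path, value) leaves.
leaves(x::Number, path=()) = [(path, x)]
leaves(x::AbstractVector, path=()) =
    [((path..., i), v) for (i, v) in enumerate(x)]
leaves(x::NamedTuple, path=()) =
    reduce(vcat, [leaves(v, (path..., k)) for (k, v) in pairs(x)])

params = (a = 1.0, b = [2.0, 3.0], c = (d = 4.0,))
for (path, val) in leaves(params)
    println(path, " => ", val)   # e.g. (:b, 1) => 2.0
end
```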


arzwa commented Dec 8, 2020

Hmm, either I don't understand, or you misunderstood my question. I am exactly interested in using Turing and the Gibbs sampler implemented there. My question is rather what I would need to do (or implement) to use the MH sampler with an adaptive proposal as a Gibbs component with a Turing model.


arzwa commented Dec 8, 2020

OK, I had some trouble finding the right docs in Turing. It seems that once that PR is merged, it should be possible to do something like the following, right?

sampler = Gibbs(
    MH(:v1=> AdvancedMH.AdaptiveProposal()), 
    MH(:v2=> AdvancedMH.AdaptiveProposal()))
chain = sample(model, sampler, 1000)

which is what I have in mind, hope that makes it clearer.


cpfiffer commented Dec 9, 2020

Ah, I understand, sorry.

There'll need to be some changes on Turing's side before that will work for variables that live in a constrained space (i.e. anything other than -Inf to Inf), but it should work reasonably well right off the bat if v1 and v2 are unconstrained and continuous.


fipelle commented Oct 25, 2022

What is the status of adaptive Metropolis proposals? Happy to help if needed.

@cpfiffer

As far as I am aware, there's no active development. Would love a push to get it in! There are two PRs for adaptive methods (#39 and #57). Both are really great but just need someone to revisit them and wrap them up.


fipelle commented Oct 25, 2022

@cpfiffer: thanks for the reply! I will take a look. I am working on an ABC implementation for Turing, and it would be handy to have this.

@cpfiffer

Awesome! Let me know if you need any pointers.

@yebai closed this as not planned on Jun 5, 2024.