Shaping ApproxInferenceBase.jl #1
I've been pointed to this issue by @jbrea because I'm currently trying to integrate some simple ABC methods into Turing.jl, so that they can be used as a drop-in replacement when standard MCMC methods are impractical. Have you had a look at AbstractMCMC.jl? At the time of writing I cannot claim to have a good overview of the ABC literature, but from what I can tell, ABC methods would fit nicely into the AbstractMCMC.jl interface/framework. The docs are here: https://turing.ml/dev/docs/for-developers/interface. The one thing that might not fit as nicely is when the transitions are bundled into a […]. The way I've currently done it in TuringLang/Turing.jl#1334 for simple ABC methods, e.g. random-walk, is to overload the […].
Essentially, you'd just define your own subtype of […]. I'd love to hear what you think! :) (Also, if this approach is not suitable for ABC, I'd love to hear that too! It would save me from implementing it in Turing only to realize later that it is insufficient.)
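The inline code spans in the comment above were lost in rendering, so here is a hedged illustration (not the actual code from TuringLang/Turing.jl#1334; all names are hypothetical) of the kind of simple, non-adaptive rejection-ABC step being described:

```julia
# Hypothetical sketch: a rejection-style ABC step in which parameters drawn
# from the prior are accepted once the simulated summary statistics land
# within a fixed tolerance of the observed ones.
using Distributions, LinearAlgebra, Random

struct RejectionABC{P,S}
    prior::P            # prior distribution over parameters
    simulate::S         # θ -> simulated summary statistics
    tolerance::Float64  # acceptance threshold (non-adaptive)
end

function abc_step(rng::AbstractRNG, sampler::RejectionABC, observed)
    while true
        θ = rand(rng, sampler.prior)
        sim = sampler.simulate(θ)
        norm(sim .- observed) <= sampler.tolerance && return θ
    end
end

# Toy usage: the "simulator" is the identity plus a little noise.
sampler = RejectionABC(Normal(0, 1), θ -> [θ + 0.01 * randn()], 0.1)
θ_accepted = abc_step(Random.default_rng(), sampler, [0.0])
```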
Hi @torfjelde. Thanks a lot for your input! I definitely want to have a look at AbstractMCMC.jl for the sampling-based ABC methods. Your implementation for non-adaptive acceptance thresholds looks neat! Did you also think about adding alternative ABC methods, like KernelABC, to Turing.jl? To avoid duplicating too much work, it may also make sense at some point to think about where the core ABC methods should live: in this org or in Turing.jl? If they stay in this org, I think it would make a lot of sense to structure them so that they are very easy to integrate with Turing.jl.
I looked a little bit more into AbstractMCMC.jl. It seems that the few ABC-MCMC methods (e.g. Marjoram et al. 2003) fit nicely into this framework. But I don't really see how the ABC-SMC (Sequential Monte Carlo) methods fit in (e.g. Beaumont et al. 2009, Del Moral et al. 2011, Drovandi and Pettitt 2011, Turner and Sederberg 2012). These methods operate iteratively on a set of particles, and AFAIK they are considered superior to the ABC-MCMC methods. Chapter 4 of the Handbook of Approximate Bayesian Computation (edited by Sisson, Fan and Beaumont) gives a nice overview. I guess it would be nice to implement the ABC-MCMC methods with AbstractMCMC.jl, but for the ABC-SMC methods, the kernel-based methods, and all the more recent methods based on function approximation (like the paper by Greenberg et al. I mentioned above) we need alternative approaches. EDIT: […]
What do you think?
As far as I know, Turing has an SMC implementation that exploits the AbstractMCMC interface; we should look into it more deeply.
Sorry for the late reply! It's not quite clear to me why ABC-SMC would be an issue. As @francescoalemanno pointed out, we already have SMC samplers implemented in the AbstractMCMC interface. Is there a particular issue you have in mind, @jbrea? And when it comes to function-approximation ("emulation") methods, it should be fairly straightforward to support them by performing the initial fitting procedure in […].
I still need to get a better overview of the ABC literature before I can properly judge which structure is a good idea, but it seems to me that the way to go would be: […]
Calling in some more of the Turing folks: @cpfiffer @devmotion @xukai92. I know you have a lot on your plates, so feel free to ignore this ping for now; I just figured it might be useful to bring someone with a bit more AbstractMCMC.jl experience into the discussion.
@torfjelde, I should open with the disclaimer that I do not have a lot of experience implementing MCMC samplers. Nonetheless, I think AbstractMCMC has promise as a basic framework for ABC methods. One advantage is that AbstractMCMC provides the option of interfacing with Turing. In some cases, I overloaded some of the methods in AbstractMCMC. However, I think someone with more experience with MCMC samplers might be able to use the default methods more creatively, or define generic methods for particle samplers.
I'll chime in and add some more comments. AbstractMCMC is not a Turing-specific thing; it is actually designed so that packages with no relation to Turing at all get a nice interface and some default methods for sampling MCMC chains for free. This is achieved by defining […].

Currently, we're preparing AbstractMCMC 2.0 (TuringLang/AbstractMCMC.jl#42). The user-facing […].

For Turing integration, it should actually not matter whether you use the AbstractMCMC interface or some other API: in any case we will have to transform/wrap the Turing models in a way that your samplers can deal with. The advantage of having a unified and general API is that less code and effort would probably be needed for Turing support of your samplers. I agree with @torfjelde's comments above that you would probably want a general model structure that is used by the different ABC methods, and maybe also an abstract type for the ABC samplers. Additionally, I would imagine that the different implementations of […].

There's nothing special about SMC in Turing with regard to the AbstractMCMC interface, although I admit that it's a little unintuitive at first: the number of iterations in SMC just corresponds to the number of particles, and we return the first particle as the first sample, etc. Hence the SMC steps (i.e., propagating and reweighting particles) do not actually correspond to the […].

As an example, in https://github.com/TuringLang/EllipticalSliceSampling.jl/blob/abstractmcmc2/src/abstractmcmc.jl you can see how EllipticalSliceSampling implements the AbstractMCMC 2.0 interface and thereby gets access to all the goodies and the sampling interface in AbstractMCMC.
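Since the inline identifiers in the comment above were dropped, here is a minimal orienting sketch of how an ABC sampler might plug into the AbstractMCMC 2.x-style stepping interface. All `My*` names are hypothetical, and the exact method signatures should be checked against the AbstractMCMC release you target:

```julia
using AbstractMCMC, Random

struct MyABCSampler <: AbstractMCMC.AbstractSampler end

struct MyABCState
    θ::Vector{Float64}
end

# Initial step: draw a starting point (placeholder initialisation).
function AbstractMCMC.step(rng::Random.AbstractRNG,
                           model::AbstractMCMC.AbstractModel,
                           sampler::MyABCSampler; kwargs...)
    θ = randn(rng, 2)
    return θ, MyABCState(θ)
end

# Subsequent steps: random-walk proposal; a real ABC sampler would
# accept/reject based on simulated vs. observed summary statistics.
function AbstractMCMC.step(rng::Random.AbstractRNG,
                           model::AbstractMCMC.AbstractModel,
                           sampler::MyABCSampler, state::MyABCState; kwargs...)
    θ = state.θ .+ 0.1 .* randn(rng, length(state.θ))
    return θ, MyABCState(θ)
end
```

With these two methods defined, calling `sample(model, MyABCSampler(), n)` should provide chain collection, progress logging, and the other AbstractMCMC conveniences for free.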
Thank you @devmotion, the new interface looks really neat. I look forward to updating my package to use it as soon as AbstractMCMC 2 is released.
Thanks a lot @torfjelde, @itsdfish and @devmotion! Things are much clearer now. Let's go ahead with the AbstractMCMC 2.0 interface 😄.
Text copied from JuliaApproxInference/LikelihoodfreeInference.jl#5.
I think this is a better place for this issue; I took the liberty of copying your markdown post, @jbrea.
## ApproxInferenceBase.jl: a common API and some basic utilities
My proposition here is to write together a very lightweight `ApproxInferenceBase.jl` package that serves as a primary dependency of ABC packages. See for example DiffEqBase.jl or ReinforcementLearningBase.jl for how this is done in other ecosystems. I would include in `ApproxInferenceBase.jl`:
### Ingredients
### API
My proposition for the API is the following (I am biased, of course, and very open to discussion!).
In addition to everything related to priors, summary statistics, and metrics, `ApproxInferenceBase.jl` exports a function `fit!` with the following signature: […]

Every ABC package that relies on `ApproxInferenceBase.jl` extends this `fit!` function, e.g. […]

The user provides models as callable objects (functions or functors) with one argument. Constants are recommended to be handled with closures. Extraction of summary statistics is done in the model. For example: […]

ABC methods/plans/setups are specified in the form […]
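The code blocks of the original post (the `fit!` signature, the package-specific extension, and the model example) were lost when the issue was copied. As a hedged reconstruction of the proposed shape, with all concrete names illustrative only:

```julia
using Distributions, LinearAlgebra, Statistics

# A method/plan/setup type that a downstream ABC package might define.
struct KernelABCSetup
    bandwidth::Float64
end

# The generic function `ApproxInferenceBase.jl` would export; each ABC
# package extends it with methods for its own setup types.
function fit!(setup::KernelABCSetup, model, prior, data; kwargs...)
    # method-specific inference goes here; could return a posterior sample
end

# Models are callables with one argument; constants are handled with
# closures, and summary statistics are extracted inside the model.
toy_simulate(θ, σ) = θ .+ σ .* randn(length(θ))   # toy simulator
toy_summarize(x) = [sum(x) / length(x), std(x)]   # toy summary statistics
σ = 0.5                                  # constant captured by the closure
model = θ -> toy_summarize(toy_simulate(θ, σ))

fit!(KernelABCSetup(0.1), model, MvNormal(zeros(2), I), [0.0, 1.0])
```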
### One master package to access all methods
Similar in spirit to DifferentialEquations.jl, we could create one package that aggregates all packages and gives unified access. The dependency graph would be something like: […]
This package does nothing but reexport the setups/methods defined in the different packages and the `fit!` function. The name of this package should of course be discussed.

### ABCProblems.jl
I think it would be nice to have a package with typical ABC benchmark problems, like the stochastic Lotka-Volterra problem, the blowfly problem, etc. Maybe we could collect them in a package `ABCProblems.jl`.

### New methods to be implemented
Here is an incomplete list of methods that I would love to see implemented in Julia: […] Together with a collection of benchmark problems, one would get a nice toolbox for benchmarking the new methods we do research on.
### Conclusions and Questions
- Who would be up for such a collaborative effort?
- How do you like my proposition for `ApproxInferenceBase.jl`? What would you change?
- Shall we create `ApproxInferenceBase.jl`, `ABCProblems.jl`, and `ABC.jl`? Or something similar with different names?