Add gradient to uniform distribution #526
Conversation
Let us try an example to see if NUTS will perform reasonably. How about we attempt the case from here with a Uniform prior instead?

Hi @nabriis, I've tried the update with the simplest BIP, and the result seems reasonable, so this PR is ready for your further review :)
Thank you @chaozg. LGTM!
The simplest BIP example looks very nice. It could be a nice idea to open an issue in the Book repo to create an exercise asking the reader to experiment with a uniform prior (as part of the long notebook on the simplest BIP in the world).
Thank you, Amal. Done as suggested, and the issue is at CUQI-DTU/CUQI-Book#90

Nice!
Fixed #495.

Tested with `cuqi.experimental.mcmc` on two demos (see below for MWE code):
- `sample` goes well for all three samplers;
- `warmup` from NUTS always gets stuck at some point; I created a separate issue to keep track of it: warmup of NUTS gets stuck with bounded priors #529

Demo 1: draw samples from Uniform(0, 1) with MALA, ULA and NUTS
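To illustrate why Demo 1 works once the uniform distribution has a gradient, here is a pure-NumPy sketch (function names, step size and implementation are my own for illustration, not CUQIpy's `cuqi.experimental.mcmc` API): inside the support the log-density of Uniform(0, 1) is constant, so the gradient is zero, the MALA drift term vanishes, and the sampler reduces to a random walk that rejects any move out of (0, 1).

```python
import numpy as np

def logpdf(x):
    # log-density of Uniform(0, 1): 0 inside the support, -inf outside
    return 0.0 if 0.0 < x < 1.0 else -np.inf

def grad_logpdf(x):
    # the kind of gradient this PR adds: zero everywhere inside the support
    return 0.0

def mala_uniform(n_samples, step=0.1, x0=0.5, seed=0):
    rng = np.random.default_rng(seed)
    x = x0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        # Langevin proposal: the drift (step**2/2)*grad vanishes for the uniform
        prop = x + 0.5 * step**2 * grad_logpdf(x) + step * rng.standard_normal()
        # with zero drift the proposal is symmetric, so the MH ratio reduces
        # to the density ratio: accept iff the proposal stays in (0, 1)
        if np.log(rng.uniform()) < logpdf(prop) - logpdf(x):
            x = prop
        samples[i] = x
    return samples

s = mala_uniform(20000)
print(s.min(), s.max())   # all samples stay inside the support
print(s.mean())           # should be near 0.5 for a well-mixed chain
```

The same zero-gradient argument applies to ULA and to NUTS's leapfrog steps, which is consistent with `sample` working for all three samplers above.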
Demo 2: solve "Simplest" BIP with MALA, ULA and NUTS
The following plot shows what x1+x2 looks like with the drawn samples
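A minimal sketch of the Demo 2 setup under assumed values (the observed value, noise level, and the `[0, 1]^2` prior box are illustrative, and the hand-rolled MALA below is not CUQIpy's implementation): with a uniform prior, the posterior gradient inside the support comes entirely from the Gaussian likelihood of `y = x1 + x2 + noise`, which is what lets gradient-based samplers run, and `x1 + x2` from the chain should concentrate near the observed value.

```python
import numpy as np

y_obs, sigma = 1.2, 0.1          # assumed observation and noise std

def log_post(x):
    # uniform prior on [0, 1]^2 contributes 0 inside the box, -inf outside
    if np.any(x <= 0.0) or np.any(x >= 1.0):
        return -np.inf
    return -0.5 * ((x[0] + x[1] - y_obs) / sigma) ** 2

def grad_log_post(x):
    # inside the prior support only the Gaussian likelihood has a gradient
    g = -(x[0] + x[1] - y_obs) / sigma**2
    return np.array([g, g])

def mala(n, step=0.05, x0=(0.5, 0.5), seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    out = np.empty((n, 2))
    for i in range(n):
        drift = 0.5 * step**2 * grad_log_post(x)
        prop = x + drift + step * rng.standard_normal(2)
        if np.isfinite(log_post(prop)):
            # full MALA ratio, including the asymmetric proposal densities
            fwd = -np.sum((prop - x - drift) ** 2) / (2 * step**2)
            drift_b = 0.5 * step**2 * grad_log_post(prop)
            bwd = -np.sum((x - prop - drift_b) ** 2) / (2 * step**2)
            if np.log(rng.uniform()) < log_post(prop) - log_post(x) + bwd - fwd:
                x = prop
        out[i] = x
    return out

xs = mala(20000)
print((xs[:, 0] + xs[:, 1]).mean())  # x1 + x2 should sit near y_obs
```

Plotting a histogram of `xs[:, 0] + xs[:, 1]` reproduces the kind of `x1+x2` plot shown above.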