
Add gradient to uniform distribution #526

Merged
merged 5 commits into from
Sep 26, 2024

Conversation

chaozg
Contributor

@chaozg chaozg commented Sep 13, 2024

Fixes #495.

  • I have tested the implementation with MALA, ULA and NUTS from cuqi.experimental.mcmc on two demos (see MWE code below);
  • sampling works well for all three samplers;
  • warmup with NUTS always gets stuck at some point; I created a separate issue to track it: warmup of NUTS gets stuck with bounded priors #529
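For context, the log-density of a uniform distribution is constant on its support, so its gradient is zero inside the bounds and undefined outside. Below is a minimal NumPy sketch of such a gradient; the function name and the NaN convention outside the support are assumptions for illustration, not the actual CUQIpy implementation:

```python
import numpy as np

def uniform_logpdf_gradient(x, low, high):
    """Hypothetical sketch: gradient of the log-density of Uniform(low, high)."""
    x = np.asarray(x, dtype=float)
    if np.all((x >= low) & (x <= high)):
        # log-density is constant on the support, so its gradient is zero
        return np.zeros_like(x)
    # outside the support the log-density is -inf; the gradient is undefined
    return np.full_like(x, np.nan)
```

Gradient-based samplers such as MALA, ULA and NUTS only evaluate the gradient at points they propose, so as long as the chain stays in the support a zero gradient is what they see.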

Demo 1: draw samples from Uniform(0, 1) with MALA, ULA and NUTS

import cuqi
import matplotlib.pyplot as plt
import numpy as np

uniform = cuqi.distribution.Uniform(np.array([0.0]), np.array([1.0]))
# sampler = cuqi.experimental.mcmc.MALA(uniform, initial_point=np.array([0.1]))
# sampler = cuqi.experimental.mcmc.ULA(uniform, initial_point=np.array([0.1]))
sampler = cuqi.experimental.mcmc.NUTS(uniform, initial_point=np.array([0.1]))
sampler.sample(10000)
samples = sampler.get_samples()
samples.plot_trace()


Demo 2: solve the "simplest" BIP with MALA, ULA and NUTS

import numpy as np
import matplotlib.pyplot as plt
import cuqi
from cuqi.utilities import plot_2D_density
np.random.seed(10)

# the forward model
A_matrix = np.array([[1.0, 1.0]])
A = cuqi.model.LinearModel(A_matrix)

# the prior
# x = Gaussian(np.zeros(2), 2.5)
x = cuqi.distribution.Uniform(np.array([-1.0, -1.0]), np.array([3.0, 3.0]))
print(x)

# the data distribution
b = cuqi.distribution.Gaussian(A@x, 0.1)

# the observed data
particular_x = np.array([1.5, 1.5])
b_given_particular_x = b(x=particular_x)
b_obs = b_given_particular_x.sample()
print(b_obs)

# the posterior
joint = cuqi.distribution.JointDistribution(x, b)
post = joint(b=b_obs)

# sampling with MALA, ULA and NUTS
# sampler = cuqi.experimental.mcmc.MALA(post, initial_point=np.array([2.5, 2.5]), scale=0.12)
# sampler = cuqi.experimental.mcmc.ULA(post, initial_point=np.array([2.5, 2.5]), scale=0.12)
sampler = cuqi.experimental.mcmc.NUTS(post, initial_point=np.array([2.5, 2.5]))

# sampler.warmup(1000)
sampler.sample(1000)
samples = sampler.get_samples()
samples.plot_trace()

# what does b look like with the drawn samples?
b_with_samples = cuqi.samples.Samples(samples.samples[:1,:] + samples.samples[1:,:])
plt.figure()
b_with_samples.plot_trace()

# plot exact posterior distribution and samples
# the posterior PDF
plt.figure()
plot_2D_density(post, -5, 5, -5, 5)
plt.title("Exact Posterior")

# samples
plt.figure()
samples.plot_pair()
plt.xlim(-5, 5)
plt.ylim(-5, 5)
plt.gca().set_aspect('equal')
plt.title("Posterior Samples")

(Trace plots and posterior plots omitted; one plot shows what x1+x2 looks like with the drawn samples.)
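One way to sanity-check a gradient like the one added in this PR is a central finite-difference comparison against the log-density. The sketch below uses hypothetical helper names and the same uniform bounds as Demo 2; it is an illustration, not code from the PR:

```python
import numpy as np

def logpdf_uniform(x, low, high):
    # log-density of Uniform(low, high) with box support
    if np.all((x >= low) & (x <= high)):
        return -np.sum(np.log(high - low))
    return -np.inf

def fd_gradient(f, x, eps=1e-6):
    # central finite differences, coordinate by coordinate
    g = np.zeros_like(x, dtype=float)
    for i in range(x.size):
        e = np.zeros_like(g)
        e[i] = eps
        g[i] = (f(x + e) - f(x - e)) / (2 * eps)
    return g

low, high = np.array([-1.0, -1.0]), np.array([3.0, 3.0])
x0 = np.array([1.5, 1.5])  # interior point of the support
g = fd_gradient(lambda x: logpdf_uniform(x, low, high), x0)
# inside the support the gradient should be (numerically) zero
```

Near the boundary the finite-difference stencil can step outside the support and hit -inf, so this check is only meaningful at interior points.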

cuqi/distribution/_uniform.py (review comment, resolved)
@nabriis
Collaborator

nabriis commented Sep 13, 2024

Let us try an example to see if NUTS will perform reasonably. How about we attempt the case from here with a Uniform prior instead?
https://cuqi-dtu.github.io/CUQI-Book/chapter01/intro_example_short.html

@chaozg chaozg removed the request for review from amal-ghamdi September 13, 2024 14:33
@chaozg chaozg marked this pull request as draft September 13, 2024 16:50
@chaozg chaozg marked this pull request as ready for review September 13, 2024 17:39
@chaozg chaozg marked this pull request as draft September 13, 2024 17:55
@chaozg chaozg marked this pull request as ready for review September 24, 2024 16:02
@chaozg
Contributor Author

chaozg commented Sep 24, 2024

Hi @nabriis , I've tried the update with the simplest BIP, and the result seems reasonable, so this PR is ready for your further review : )

Contributor

@amal-ghamdi amal-ghamdi left a comment


Thank you @chaozg. LGTM!

The simplest BIP example looks very nice. It could be a nice idea to open an issue in the Book repo to add an exercise asking the reader to experiment with a uniform prior (as part of the long notebook on the simplest BIP in the world).

@chaozg
Contributor Author

chaozg commented Sep 24, 2024

Thank you, Amal. Done as suggested; the issue is at CUQI-DTU/CUQI-Book#90.

@chaozg chaozg requested a review from nabriis September 24, 2024 16:55
@nabriis
Collaborator

nabriis commented Sep 24, 2024

Nice!

@chaozg chaozg merged commit d00b1c9 into main Sep 26, 2024
6 checks passed
@chaozg chaozg deleted the add_gradient_to_uniform branch September 26, 2024 08:45
Successfully merging this pull request may close these issues.

Implement gradient for uniform distribution
3 participants