Add (iid) truncated normal and also add grad to normal #542
base: main
Conversation
Thank you @chaozg for adding this very useful distribution! I added some comments, feel free to address them as you see fit.
Co-authored-by: amal-ghamdi <[email protected]>
Really nice @chaozg. I agree with Amal's suggestions, and I had a few of my own.
```python
class TruncatedNormal(Distribution):
    """
    Truncated Normal probability distribution. Generates instance of cuqi.distribution.TruncatedNormal. It allows the user to specify upper and lower bounds on random variables represented by a Normal distribution. This distribution is suitable for a small dimension setup (e.g. `dim`=3 or 4). Using TruncatedNormal Distribution with a larger dimension can lead to a high rejection rate when used within MCMC samplers.
```
Could you break this line into smaller pieces?
Done as suggested
```python
    def _sample(self, N=1, rng=None):
        """
        Generates random samples from the distribution.
```
Perhaps for direct sampling of the distribution we could use simple rejection sampling? If asked for N samples, we sample a Gaussian N times, remove out-of-bounds points, sample another N times, remove out-of-bounds points, and so on. If we now have N or more samples we return the first N; otherwise we repeat until we get N samples. This could also be used to compare with the samplers you showed in the code.
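The batch strategy described above could look roughly like the following sketch (`rejection_sample_truncated` and its signature are illustrative, not part of cuqi):

```python
import numpy as np

def rejection_sample_truncated(mean, std, low, high, N, rng=None):
    """Hypothetical helper illustrating batch rejection sampling:
    draw N Gaussian samples at a time, discard out-of-bounds points,
    and repeat until N samples have been accepted."""
    rng = np.random.default_rng() if rng is None else rng
    accepted = []
    while sum(len(a) for a in accepted) < N:
        # Draw a fresh batch of N candidate points.
        batch = rng.normal(mean, std, size=(N, np.size(mean)))
        # Keep only points with every component inside the bounds.
        mask = np.all((batch >= low) & (batch <= high), axis=1)
        accepted.append(batch[mask])
    return np.concatenate(accepted)[:N]
```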
I just added `_sample` with a slightly different strategy: I generate samples one by one and check whether each is within the bounds, until we get N samples. Do you think this strategy is OK?
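For reference, the one-at-a-time strategy could be sketched as follows (names are illustrative, not the actual `_sample` implementation):

```python
import numpy as np

def sample_one_by_one(mean, std, low, high, N, rng=None):
    """Hypothetical sketch of one-at-a-time rejection sampling:
    draw a single Gaussian sample, keep it only if it lies within
    [low, high], and repeat until N samples have been accepted."""
    rng = np.random.default_rng() if rng is None else rng
    samples = []
    while len(samples) < N:
        point = rng.normal(mean, std)
        # Accept the point only if it is inside the bounds.
        if np.all(point >= low) and np.all(point <= high):
            samples.append(point)
    return np.array(samples)
```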
Just a comment here: would it be possible to simply pass the rng through, i.e. `sample = self._normal.sample(rng=rng)`?
```python
from scipy.special import erf
from cuqi.distribution import Distribution


class TruncatedNormal(Distribution):
```
You made these really nice showcases in the PR @chaozg. I suggest you add one or two of them as "Examples" in the docstring!
Added an example in the docstring as suggested.
Co-authored-by: Nicolai André Brogaard Riis <[email protected]>
Thanks @amal-ghamdi and @nabriis for your review. I think I have addressed your comments, so I'm requesting your further review here. Note that the existing Normal lacks grad() (#548), so I also add it in this PR. Also note that sample() as implemented lacks a way to carry the rng, and I'm not sure of the best way to solve this.
Many thanks @chaozg for the updates! It is nice that the distribution is based on Normal now and that you added the gradient for the Normal distribution. I have only a few comments to consider as you see fit.
```python
    def gradient(self, x):
        return -(x - self.mean)/(self.std**2)
```
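For context, this one-liner is the gradient of the Gaussian log-density:

```latex
\log p(x) = -\frac{(x-\mu)^2}{2\sigma^2} + \text{const},
\qquad
\nabla_x \log p(x) = -\frac{x-\mu}{\sigma^2}.
```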
This is a nice bonus that we have a gradient for Normal now :). Just one suggestion to add two checks:

1. the geometry is not a geometry that needs a chain rule;
2. the distribution is used as a prior, not a likelihood, because we do not account for the chain rule here.

For example, the gradient implementation of cmrf does these two checks:
```python
def _gradient(self, val, **kwargs):
    # Avoid complicated geometries that change the gradient.
    if not type(self.geometry) in _get_identity_geometries():
        raise NotImplementedError("Gradient not implemented for distribution {} with geometry {}".format(self, self.geometry))
    if not callable(self.location):  # for prior
        diff = self._diff_op._matrix @ val
        return (-2*diff/(diff**2+self.scale**2)) @ self._diff_op._matrix
    else:
        warnings.warn('Gradient not implemented for {}'.format(type(self.location)))
```
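Folding the two suggested checks into the Normal gradient could look roughly like this standalone sketch (the function name, the `geometry_is_identity` flag, and the plain arguments are illustrative stand-ins, not the cuqi API):

```python
import warnings
import numpy as np

def normal_gradient(val, mean, std, geometry_is_identity=True):
    """Sketch of the Normal gradient with the two suggested checks."""
    # 1) Geometry check: bail out if the geometry would require a
    #    chain-rule correction to the gradient.
    if not geometry_is_identity:
        raise NotImplementedError("Gradient not implemented for non-identity geometries")
    # 2) Prior-only check: a callable mean indicates a conditioned
    #    distribution (likelihood use), which would also need the chain rule.
    if callable(mean):
        warnings.warn("Gradient not implemented for callable mean")
        return None
    return -(np.asarray(val) - mean) / (std**2)
```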
```python
        if np.all(point >= low) and np.all(point <= high):
            assert x_trun.logpdf(point) == approx(x.logpdf(point))
        else:
            assert np.isinf(x_trun.logpdf(point))
```
A suggestion to use `np.isneginf` instead of `np.isinf` here, to check specifically for negative infinity.
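The distinction matters because `np.isinf` accepts either sign, while an out-of-bounds logpdf should be exactly negative infinity:

```python
import numpy as np

# np.isinf is True for both +inf and -inf; np.isneginf also checks
# the sign, which is what a logpdf of -inf should satisfy.
assert np.isinf(np.inf) and np.isinf(-np.inf)
assert np.isneginf(-np.inf)
assert not np.isneginf(np.inf)
```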
Fixed #321
Fixed #548 (grad of Normal was missing)

With this PR, we can draw samples from truncated iid normal distributions, i.e., truncated versions of cuqi.distribution.Normal, or use them as priors.

- Demo 1: 2D normal distribution N([0; 0], [1, 0; 0, 1]) truncated to the square domain [-2, 2] x [-2, 2]
- Demo 2: bottom-right quarter of the 2D normal distribution N([0; 0], [1, 0; 0, 1])
- Demo 3: simplest BIP
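For reference, demo 1 can be recreated with plain NumPy rejection sampling (this does not use the cuqi API added in this PR; it is only a hedged illustration of what the demo produces):

```python
import numpy as np

# Demo 1: iid N(0, 1) in 2D, truncated to the square [-2, 2] x [-2, 2],
# sampled by batch rejection.
rng = np.random.default_rng(0)
samples = np.empty((0, 2))
while len(samples) < 1000:
    batch = rng.normal(size=(1000, 2))
    # Keep rows where both components lie inside the square.
    keep = np.all(np.abs(batch) <= 2.0, axis=1)
    samples = np.vstack([samples, batch[keep]])
samples = samples[:1000]
```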