Implement truncated normal distribution #78
Comments
Ha ha, @martinjankowiak and I were just talking about this last night 😄 PyTorch has a differentiable …
Can we make it more general than TruncatedNormal? I had written it in the design doc but we didn't discuss it, and honestly I don't know if there's any other way than explicitly providing truncated pdfs.
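A generic version along these lines might look like the following sketch. It is plain Python for illustration only: the `TruncatedDistribution` name, the `pdf`/`cdf`/`inv_cdf` interface, and the use of the stdlib `NormalDist` as a base are all assumptions, not this project's API.

```python
import math
import random
from statistics import NormalDist

class TruncatedDistribution:
    """Sketch of a generic truncated wrapper around any base
    distribution exposing pdf/cdf/inv_cdf (illustrative interface;
    the stdlib NormalDist happens to provide all three)."""

    def __init__(self, base, low=-math.inf, high=math.inf):
        self.base, self.low, self.high = base, low, high
        # z = P(low <= X <= high) under the base distribution;
        # this is the normalizer for the truncated density.
        self.z = base.cdf(high) - base.cdf(low)

    def log_prob(self, x):
        # Renormalized log-density: log pdf(x) - log z inside the
        # bounds, -inf outside.
        if not (self.low <= x <= self.high):
            return -math.inf
        return math.log(self.base.pdf(x)) - math.log(self.z)

    def sample(self, rng=random):
        # Naive rejection: draw from the base until we land inside
        # the bounds. Fine when z is not tiny; hopeless in far tails.
        while True:
            x = self.base.inv_cdf(rng.random())
            if self.low <= x <= self.high:
                return x
```

The only distribution-specific inputs are the base pdf and cdf, which is what "explicitly providing truncated pdfs" seems to come down to.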
Reparameterization is tricky for bounded distributions. I see a few possible ways to handle this in a more general way: …
I think it's a good idea to implement …
Even if we have …, I think for truncated normal, for example, we could use inverse transform sampling if we are within ±5 standard deviations (for Double, anyway; it's probably less for Float). But if we are sampling from the tails (say, a standard normal truncated to have a lower bound of 6) we'd need to do something else. (e.g. I believe there's a rejection sampling algorithm for this in Devroye's book.)
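The moderate-bounds case described here can be sketched with plain inverse-transform sampling. The stdlib `NormalDist` and the function name are illustrative assumptions, not the project's API:

```python
import random
from statistics import NormalDist

def truncated_normal_sample(mu, sigma, low, high, rng=random):
    """Inverse-transform sampler for Normal(mu, sigma) truncated to
    [low, high]. Works while cdf(low) and cdf(high) are distinguishable
    in floating point (roughly the +/-5-sigma regime mentioned above);
    in the far tails both CDF values round to the same double and the
    draw degenerates, which is where rejection sampling takes over."""
    d = NormalDist(mu, sigma)
    u_low, u_high = d.cdf(low), d.cdf(high)
    # Draw uniformly on [cdf(low), cdf(high)) and map back through
    # the inverse CDF.
    u = u_low + rng.random() * (u_high - u_low)
    return d.inv_cdf(u)
```

Note that this construction is also a reparameterization: for fixed uniform noise `u`, the sample is a deterministic, differentiable function of `mu` and `sigma`.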
Hmm, if we use rejection sampling to construct a sample, can we use …, where …?
Something along those lines, yes: we just need to be able to re-normalize the part of the distribution which remains after truncation. There is still the problem of tails. For example, suppose I would like to sample from a standard normal distribution with bounds … The … Maybe this is a bit of an edge case…?
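For the one-sided tail case, a standard rejection scheme (of the kind found in Devroye's book; this shifted-exponential variant is usually credited to Robert, 1995) can be sketched as follows. The function name is made up for illustration:

```python
import math
import random

def std_normal_tail_sample(a, rng=random):
    """Sample X ~ Normal(0, 1) conditioned on X >= a, for a > 0 large
    enough (e.g. a = 6) that inverse-transform sampling runs out of
    floating-point resolution. Uses rejection sampling with a shifted
    exponential proposal; the acceptance rate stays high even for
    very large a."""
    lam = (a + math.sqrt(a * a + 4.0)) / 2.0  # optimal proposal rate
    while True:
        # Propose x = a + Exponential(rate=lam).
        x = a - math.log(1.0 - rng.random()) / lam
        # Accepting with probability exp(-(x - lam)^2 / 2) makes the
        # accepted draws exactly tail-normal distributed.
        if rng.random() <= math.exp(-0.5 * (x - lam) ** 2):
            return x
```

A two-sided deep-tail truncation could reuse this by additionally rejecting draws above the upper bound, though that becomes slow when the interval carries a vanishing fraction of the tail mass, which is exactly the edge case raised here.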
I recently found myself wanting a truncated normal distribution (e.g. a univariate normal with parameters specifying a minimum and/or maximum value).
Is this a sufficiently common need to include here? If so, I'll volunteer.