Not exploring full prior space #253

Open
sumitaghosh opened this issue May 24, 2024 · 2 comments

Comments


sumitaghosh commented May 24, 2024

I have a likelihood that depends linearly on 39 parameters and involves creating an 89x40x3 array and summing its elements. Back when I only had the 89x40 array, the best fit worked fine. Now it converges on a wrong value. There is a certain combination of parameters (within the prior space and not at its edge) that gives me a likelihood of -38565.34122985974, but PyMultiNest never seems to find any points with a likelihood above -40000. I have tried pymultinest.solve with n_live_points=2000 and sampling_efficiency=1.0, but the run still never reaches a higher likelihood and reports "Parameter 1 of mode 1 is converging towards the edge of the prior." What else can I try? I can't provide a minimal working example because the problem only appears with the full setup, so is there any other information that would be helpful here?
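
For reference, a minimal sketch of the kind of pymultinest.solve call described above (the model, data shapes, prior ranges, and file names below are placeholders, not the actual code from this issue):

```python
import numpy as np
import pymultinest

ndim = 39
# placeholders for the real inputs: per-parameter response and observations
X = np.random.rand(89, 40, 3, ndim)
data = np.random.rand(89, 40, 3)
sigma = 1.0

def prior(cube):
    # map the unit cube to the physical prior ranges (here: uniform on [-10, 10])
    return 20.0 * cube - 10.0

def loglike(params):
    model = np.sum(X * params, axis=-1)  # linear in the parameters
    return -0.5 * np.sum(((data - model) / sigma) ** 2)

result = pymultinest.solve(
    LogLikelihood=loglike, Prior=prior, n_dims=ndim,
    outputfiles_basename='chains/linear_',  # the chains/ directory must exist
    n_live_points=2000, sampling_efficiency=1.0, verbose=True)
print('log Z =', result['logZ'], '+/-', result['logZerr'])
```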


JohannesBuchner (Owner) commented May 25, 2024

Don't forget that Bayesian inference is sampling the posterior, not optimizing the likelihood. If such a peak occupies only a tiny fraction of the prior space, and other large regions of the prior have moderate likelihood, then the small peak may not be important to include because it contributes little to the integral.
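
A rough back-of-envelope version of this argument (with invented numbers, not the ones from this issue): a region's contribution to the evidence is roughly its prior volume fraction times its likelihood, so a narrow peak only dominates if log(volume fraction) + logL_peak exceeds the bulk's logL.

```python
# Toy illustration with invented numbers: evidence contribution of a narrow
# high-likelihood peak vs. a broad region of moderate likelihood.
import numpy as np

logL_peak = -100.0    # log-likelihood at the narrow peak
frac_peak = 1e-8      # fraction of the prior volume the peak occupies
logL_bulk = -110.0    # typical log-likelihood over the rest of the prior

log_contrib_peak = np.log(frac_peak) + logL_peak        # about -118.4
log_contrib_bulk = np.log(1.0 - frac_peak) + logL_bulk  # about -110.0

print(log_contrib_peak, log_contrib_bulk)
# Here the broad region dominates the integral, so a posterior sample can
# legitimately ignore the peak.
```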

What you can do:

  • check your likelihood: is it returning a high value at the location you expect?
  • check the continuity: draw a line from the highest-likelihood point it finds to the peak point you expect, and compute the likelihood at 10,000 points along that line (see the sketch after this list)
  • prior predictive checks: draw samples from the prior and visualise the model there. Consider whether it is reasonable for the sampled models to have equal weight. If not, you may want to change your prior.
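
A minimal sketch of the first two checks, assuming your log-likelihood is wrapped in a Python function `loglike(params)` and that `theta_found` / `theta_expected` hold the best point reported by PyMultiNest and the point where you expect the peak (the names and the toy likelihood below are placeholders):

```python
import numpy as np
import matplotlib.pyplot as plt

def loglike(params):
    # toy stand-in: replace with your actual log-likelihood
    return -0.5 * np.sum(params ** 2)

theta_found = np.zeros(39)     # placeholder: best point reported by the sampler
theta_expected = np.ones(39)   # placeholder: point where you expect logL ~ -38565

# Check 1: does the likelihood return the high value you expect at that point?
print("logL at expected point:", loglike(theta_expected))

# Check 2: continuity along the straight line between the two points
t = np.linspace(0.0, 1.0, 10000)
logls = np.array([loglike((1 - ti) * theta_found + ti * theta_expected) for ti in t])

plt.plot(t, logls)
plt.xlabel("fraction of the way from found point to expected peak")
plt.ylabel("log-likelihood")
plt.show()
```

If the log-likelihood drops sharply or is discontinuous along that line, the expected peak may be an isolated island that is hard for the live points to reach.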

sumitaghosh (Author) commented

Thank you!

I'm not sure I understood the last bullet point, but I did change the prior slightly and was able to get a higher likelihood. The first and second bullets are probably best summarized by this plot:

[screenshot attached: Screenshot 2024-05-26 at 11 32 12 AM]

I'm also not sure how the analyzer chooses where to stick around. Here's the likelihood for every step in the chain (sequentially) from another run where I imposed more constraints; you can see that it's smooth for a long time (why?), and then it suddenly starts exploring locally, but only for a short period. How does it choose where to stop and explore?

[screenshot attached: Screenshot 2024-05-26 at 1 18 48 PM]
