
Fix and extension to the way that PolyChord treats priors #233

Open · wants to merge 4 commits into master
Conversation

williamjameshandley
Contributor

In contrast to #232 and #231, this is a slightly more complicated/controversial PR.

The way priors are treated in nested sampling is quite subtle, in particular the clear separation it makes between prior and likelihood. This was first covered by Chen, Feroz & Hobson, and is something I'm exploring with @appetrosyan as part of his supernest package, and in a soon-to-be-released paper. This has been partially discussed in #77.

In implementing a prior, one can either incorporate it as an additional density term added to the loglikelihood, or as a transformation from the unit hypercube. MCMC approaches see these as equivalent, but nested sampling treats them quite differently. Both recover the same posterior and evidence, although they give different KL divergences and different runtimes/accuracy (as a rule of thumb, the more information you put into the prior, the lower the DKL, the faster the compression from prior to posterior, and the more accurate the nested sampling run).
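
As a minimal one-dimensional sketch of the two formulations, assuming a Gaussian prior N(mu, sigma) and using scipy only (the function names loglike, loglike_plus_logprior and prior_transform are illustrative, not cobaya or PolyChord API):

from scipy import stats

mu, sigma = 0.0, 1.0

def loglike(theta):
    # Stand-in likelihood for illustration; in practice this is the expensive physics code.
    return stats.norm.logpdf(theta, loc=2.0, scale=0.5)

# Option A: the prior enters as an extra log-density term on top of the loglikelihood.
def loglike_plus_logprior(theta):
    return loglike(theta) + stats.norm.logpdf(theta, loc=mu, scale=sigma)

# Option B: the prior enters as a map from the unit hypercube to physical space,
# so the sampler draws u ~ Uniform(0, 1) and only ever evaluates loglike(theta).
def prior_transform(u):
    return stats.norm.ppf(u, loc=mu, scale=sigma)

MCMC only ever sees the product of the two densities, so A and B are interchangeable there; nested sampling compresses from the prior defined by B towards the posterior, which is why the split matters.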

This PR implements how I would set this up in cobaya, although discussion would be useful in case I've missed any subtleties. It:

  1. sums the loglikelihoods for the loglikelihood calculation (ignoring the prior density terms)
  2. implements priors (uniform, Gaussian and beyond) via scipy ppf functions (see the sketch below), recording the prior density as derived parameters, as before
  3. removes the 'volume' correction, which I believe is now redundant.
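
A rough sketch of the idea behind points 1-3 for a hypothetical two-parameter model (this is not the PR's actual polychord.py code; prior_dists, prior_transform and logprior_derived are made-up names):

import numpy as np
from scipy import stats

# One frozen scipy distribution per sampled parameter (hypothetical example).
prior_dists = [
    stats.uniform(loc=0.0, scale=10.0),  # flat on [0, 10]
    stats.norm(loc=0.0, scale=1.0),      # Gaussian N(0, 1)
]

def prior_transform(u):
    """Map a unit-hypercube point to physical parameters via each distribution's ppf."""
    return np.array([d.ppf(ui) for d, ui in zip(prior_dists, u)])

def logprior_derived(theta):
    """Prior log-density, recorded only as a derived parameter (not added to the loglike)."""
    return float(sum(d.logpdf(ti) for d, ti in zip(prior_dists, theta)))

Because the prior information is absorbed into the hypercube transform, the loglikelihood handed to PolyChord is just the sum of the individual loglikelihoods, and no separate volume correction is needed.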

The only thing this therefore fails to handle is the SZ prior for the plik TT|TTTEEE likelihood. In this framework you have to add it manually as a likelihood rather than as a prior density, using exactly the same lambda syntax as you would for a prior, e.g.

likelihood:
  planck_2018_highl_plik.TTTEEE: null
  planck_2018_highl_plik.SZ: 'lambda ksz_norm, A_sz: stats.norm.logpdf(ksz_norm+1.6*A_sz, loc=9.5, scale=3.0)'

It would be great if cobaya-cosmo-generator could do this -- any hints on how I could implement this?

@williamjameshandley changed the title from "Fix to the way that PolyChord treats priors" to "Fix and extension to the way that PolyChord treats priors" on Feb 11, 2022
@JesusTorrado
Contributor

Thanks! This was indeed somewhere on the to-do list, as suggested by @lukashergt, so we'll merge it after some work.

As for the "extra" priors, I am sure we will find a way to keep them semantically as priors in the input, but have the PolyChord interface deal with the difference. As mentioned elsewhere, I'll come back to this soon!

@codecov-commenter

codecov-commenter commented Feb 11, 2022

Codecov Report

Merging #233 (acda48a) into master (d945838) will decrease coverage by 0.01%.
The diff coverage is 0.00%.


@@            Coverage Diff             @@
##           master     #233      +/-   ##
==========================================
- Coverage   87.88%   87.87%   -0.02%     
==========================================
  Files          92       92              
  Lines        8335     8335              
==========================================
- Hits         7325     7324       -1     
- Misses       1010     1011       +1     
Impacted Files                             Coverage   Δ
cobaya/samplers/polychord/polychord.py     32.20%     <0.00%> (ø)
cobaya/samplers/mcmc/mcmc.py               90.23%     <0.00%> (-0.24%) ⬇️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

@williamjameshandley
Contributor Author

williamjameshandley commented Feb 11, 2022

I did have a think about how to automate this, but the best way for PolyChord to discover the manual prior density structure wasn't immediately obvious to me. Any hints would be much appreciated, as I agree that the most user-friendly solution would be for yaml-specified prior densities to be treated as likelihoods from PolyChord's perspective (although the current way is more explicit).

@lukashergt
Contributor

lukashergt commented Feb 11, 2022

This PR is basically an update of #104.

It would be good to copy the LaTeX correction from there (i.e. here): it applies to the old line 263 (new line 251), which should read \log\pi instead of \pi.

@lukashergt
Contributor

lukashergt commented Feb 11, 2022

The only thing this therefore fails to handle is the SZ prior for the plik TT|TTTEEE likelihood. In this framework you have to add it manually as a likelihood rather than as a prior density, using exactly the same lambda syntax as you would for a prior, e.g.

likelihood:
  planck_2018_highl_plik.TTTEEE: null
  planck_2018_highl_plik.SZ: 'lambda ksz_norm, A_sz: stats.norm.logpdf(ksz_norm+1.6*A_sz, loc=9.5, scale=3.0)'

Regarding multidimensional priors, it is worth noting that excluding part of the parameter space still works at the prior level.

For example, it still works to restrict one parameter x to be greater than another parameter y by adding the following to your yaml file:

prior:
    exclude_prior: "lambda x, y: np.log(x > y)"

But if you want to add a finite logpdf value, then with this PR you will have to add it to the loglikelihood instead of the logprior.
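
For instance, a soft Gaussian preference for x to sit about one unit above y (a hypothetical constraint; the block name soft_x_gt_y and the numbers are purely illustrative) would move from the prior block to the likelihood block:

prior:
    # before this PR: a finite (Gaussian) logpdf could live in the prior block
    soft_x_gt_y: 'lambda x, y: stats.norm.logpdf(x - y, loc=1.0, scale=0.5)'

likelihood:
    # with this PR: the same term has to be declared as a likelihood instead
    soft_x_gt_y: 'lambda x, y: stats.norm.logpdf(x - y, loc=1.0, scale=0.5)'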

@williamjameshandley
Contributor Author

Many thanks @lukashergt for pointing out these links. This therefore fixes #101, fixes #103 and closes #104.

lukashergt added a commit to lukashergt/cobaya that referenced this pull request Nov 27, 2023