[DRAFT] Importance sampling #47
Conversation
Codecov Report

```diff
@@            Coverage Diff             @@
##           master      #47      +/-   ##
==========================================
+ Coverage   81.07%   82.35%   +1.27%
==========================================
  Files           5        5
  Lines         539      578      +39
==========================================
+ Hits          437      476      +39
  Misses        102      102
```
Hi @yallup, this looks very cool! We discussed a bit offline, but to record some of my thoughts on this: my understanding is that you want to perform the integration using importance sampling, with the trained flow as the proposal distribution, but I'm not 100% sure. Am I correct in interpreting the efficiency as a measure of how accurate the flow is? We define the weights as the ratio of likelihood times prior to the flow density (see the sketch below). It's not in any way a measure of how efficient the nested sampling run itself is, which is what we would get if we calculated the efficiency from the run itself?
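To spell out the quantities I have in mind (a sketch in my own notation, with $q$ the flow density and samples $\theta_i \sim q$):

$$
w_i = \frac{\mathcal{L}(\theta_i)\,\pi(\theta_i)}{q(\theta_i)}, \qquad
\hat{Z} = \frac{1}{N}\sum_{i=1}^{N} w_i, \qquad
\epsilon = \frac{\left(\sum_i w_i\right)^2}{N \sum_i w_i^2}.
$$

On that reading, $\epsilon$ is an effective sample size fraction: it approaches 1 as the flow approaches the posterior, so it diagnoses the flow rather than the nested sampling run.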
Could we move the standalone scripts out of the repository? Ohh, and write a test to check that the evidence recovered from the new `integrate` function is consistent with the nested sampling estimate?
Test added and files removed. Wasn't sure how you wanted to approach documenting this in the tutorial, so I've just left that for now (probably to be added after we decide if this works).
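For illustration, a minimal sketch of what such a check could look like (not the PR's actual test; it assumes a 4D Gaussian likelihood in a unit hypercube prior, where the evidence is analytically close to 1, and uses a Gaussian stand-in for the trained flow):

```python
import numpy as np

def test_importance_evidence_recovers_gaussian():
    rng = np.random.default_rng(0)
    d, sigma, n = 4, 0.1, 50_000

    def log_likelihood(theta):
        # normalised 4D Gaussian centred at 0.5 in each dimension
        return (-0.5 * np.sum((theta - 0.5) ** 2, axis=1) / sigma**2
                - 0.5 * d * np.log(2 * np.pi * sigma**2))

    # Stand-in for the trained flow: a Gaussian that has learned the
    # posterior exactly, so its log-density equals the log-likelihood here.
    theta = rng.normal(0.5, sigma, size=(n, d))
    log_q = log_likelihood(theta)

    # Uniform prior on the unit hypercube: density 1 inside, 0 outside.
    inside = np.all((theta >= 0) & (theta <= 1), axis=1)
    log_pi = np.where(inside, 0.0, -np.inf)

    # Importance weights log w = log L + log pi - log q, stabilised estimate.
    log_w = log_likelihood(theta) + log_pi - log_q
    log_z = np.max(log_w) + np.log(np.mean(np.exp(log_w - np.max(log_w))))

    # The Gaussian mass lies well inside the cube, so Z ~ 1 and log Z ~ 0.
    assert abs(log_z) < 1e-2
```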
@yallup sure, no worries, we can add a tutorial later down the line. Can we bump the version number to 1.2.0 here? It needs doing in the README and setup.py. I just changed the KL divergence function so that it returns a dictionary rather than a pandas table; it's a bit nicer to use and more consistent with the new importance sampling function you have added here.
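For illustration, a hypothetical sketch of the new return style (the function name and dictionary keys are assumptions, not margarine's actual implementation):

```python
import numpy as np

def kl_divergence(log_posterior, log_prior, weights):
    """Hypothetical sketch: KL divergence D(posterior || prior) estimated
    from weighted posterior samples, returned as a plain dict rather than
    a pandas table."""
    w = np.asarray(weights) / np.sum(weights)
    kl = float(np.sum(w * (np.asarray(log_posterior) - np.asarray(log_prior))))
    return {"KL Divergence": kl}
```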
Ohh, I think the master branch might need merging in here too...
Looks good to me. Go ahead and squash and merge when you get a chance! Thanks @yallup! 🚀
This PR implements importance sampling from a margarine flow. This is something like neural importance sampling, only with nested sampling doing the information acquisition; neural importance nested sampling, I guess?
Workflow as follows:
- `run_pypolychord.py`: A copy of the standard pypolychord script, a 4D Gaussian restricted to a hypercube prior for now for simplicity.
- `train_maf.py`: Trains a margarine MAF on the polychord run.
- `importance.py`: Uses the trained MAF to importance sample the original likelihood again. I've added an `integrate` function to the `margarine.marginal_stats.calculate` class to do this (a sketch of the idea is below).

This may have broader use (provided I've done this right) as an afterburner to improve the nested sampling error estimate for moderate dimension problems. To be discussed, and tested to see if this actually works as well as I claim!