SNLE is more unstable than SNPE when inferring on observed data #1572
Unanswered
pardocelsa
asked this question in Q&A
Replies: 1 comment 1 reply
-
Hi! I don't think that one can generally say that SNLE is more sensitive to OOD data. It might well be the case for your specific simulator, though. We have recently added functionality to detect misspecification (you have to use the GitHub version of sbi for it). Hope this helps!
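The reply above does not show the call itself, so here is a minimal, library-agnostic sketch of the kind of check such a misspecification test performs: compare how far the PCA-compressed observation sits from the simulated summaries against the same quantity computed for the simulations themselves. All names (`ood_pvalue`, `x_sim`, `x_obs`) are illustrative assumptions, not the sbi interface.

```python
# Hypothetical sketch (not the sbi API): rank the observation's distance to the
# simulated summaries against the leave-one-out distances of the simulations
# themselves. A very small value suggests the observation is out of distribution.
import numpy as np

def ood_pvalue(x_sim, x_obs, k=5):
    """x_sim: (n, d) compressed simulated summaries, x_obs: (d,) compressed observation."""
    x_sim = np.asarray(x_sim, dtype=float)
    x_obs = np.asarray(x_obs, dtype=float)

    # Standardize every dimension using the simulated summaries.
    mu, sigma = x_sim.mean(0), x_sim.std(0) + 1e-12
    z_sim = (x_sim - mu) / sigma
    z_obs = (x_obs - mu) / sigma

    def knn_dist(point, reference):
        # Mean distance to the k nearest reference points.
        d = np.sqrt(((reference - point) ** 2).sum(-1))
        return np.sort(d)[:k].mean()

    # Null distribution: leave-one-out kNN distance of each simulation.
    null = np.array([
        knn_dist(z_sim[i], np.delete(z_sim, i, axis=0)) for i in range(len(z_sim))
    ])
    stat = knn_dist(z_obs, z_sim)

    # Fraction of simulations that are at least as "far out" as the observation.
    return float((null >= stat).mean())
```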
-
Hi,
I'm observing unstable behaviour in the posterior when using SNLE with PCA compression on real (observed) data, but not when using simulated data or when using SNPE. This suggests that SNLE might be more sensitive to out-of-distribution data.
To better understand this issue, I have done the following experiments using the same PCA-based compression method across both SNLE and SNPE:
SNLE on observed data: I'm trying to infer 7 parameters using SNLE, with an ensemble of 5 neural networks over 3 rounds (a minimal sketch of this setup appears after the list). On the observed data, I see bimodality in the posterior, and each network in the ensemble yields a different result. The only difference between the networks is their random weight initialization. Here's how the marginal posterior distributions look across rounds:

SNLE on simulated data: When I apply the same SNLE procedure to a test sample generated by the simulator, the inference performs well: there's no bimodality, and the results are stable.

SNPE on observed data (same compression): Finally, I used the same PCA-compressed outputs within an SNPE approach to infer the parameters from the observed data. In this case, the posterior distributions look smooth, with no bimodality, and they converge nicely across rounds.

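For concreteness, here is a minimal sketch of the setup in the first experiment, assuming the standard sbi tutorial interface (SNLE, append_simulations, train, build_posterior, set_default_x) and scikit-learn's PCA for the compression. `simulator`, `prior`, `x_obs_raw`, the number of components, and the seeding scheme are placeholders rather than the exact configuration used here.

```python
# Sketch of multi-round SNLE with PCA compression and a small ensemble,
# assuming the standard sbi tutorial API; `simulator`, `prior`, and
# `x_obs_raw` are placeholders for the user's own objects.
import torch
from sklearn.decomposition import PCA
from sbi.inference import SNLE

num_rounds, num_sims, num_ensemble = 3, 1000, 5

# Fit the PCA compression once on a pilot set of simulations.
theta_pilot = prior.sample((num_sims,))
x_pilot = simulator(theta_pilot)
pca = PCA(n_components=10).fit(x_pilot.numpy())

def compress(x):
    """PCA-compress raw simulator outputs to summary statistics."""
    return torch.as_tensor(pca.transform(x.numpy()), dtype=torch.float32)

x_obs = compress(x_obs_raw)

posteriors = []
for seed in range(num_ensemble):
    torch.manual_seed(seed)  # different random seed per ensemble member
    inference = SNLE(prior=prior)
    proposal = prior
    for _ in range(num_rounds):
        theta = proposal.sample((num_sims,))
        x = compress(simulator(theta))
        density_estimator = inference.append_simulations(theta, x).train()
        posterior = inference.build_posterior(density_estimator)
        proposal = posterior.set_default_x(x_obs)
    posteriors.append(posterior)

# Compare marginal posterior samples across ensemble members.
samples = [p.sample((5000,), x=x_obs) for p in posteriors]
```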
This makes me think that SNLE might be more sensitive to out-of-distribution data. Since our simulator is not perfect, it's likely that the observed data lies somewhat outside the simulated training distribution.
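A quick way to probe this (a hedged sketch, not something reported above) is to check where each PCA component of the observed summary falls within the distribution of the compressed training simulations; components near 0 or 1 indicate the observation sits in the tails of the training distribution. `x_sim_pca` and `x_obs_pca` are assumed names for those arrays.

```python
# Sketch: per-component percentile rank of the observed PCA summaries within
# the simulated training summaries; values close to 0 or 1 flag components
# where the observation falls in the tails of the training distribution.
import numpy as np

def component_ranks(x_sim_pca, x_obs_pca):
    """x_sim_pca: (n, d) compressed simulations, x_obs_pca: (d,) compressed observation."""
    x_sim_pca = np.asarray(x_sim_pca, dtype=float)
    x_obs_pca = np.asarray(x_obs_pca, dtype=float)
    return (x_sim_pca <= x_obs_pca).mean(axis=0)

ranks = component_ranks(x_sim_pca, x_obs_pca)
for i, r in enumerate(ranks):
    flag = "  <-- in the tails" if min(r, 1 - r) < 0.01 else ""
    print(f"PCA component {i}: percentile {r:.3f}{flag}")
```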
Have you encountered this behaviour before?