Commit

docs: move examples to tutorials, fix tutorial ordering (#1231)
* move examples into tutorials, adapt docs and links

* reorder tutorials, fix links

* update implemented papers on landing page.
janfb authored Aug 23, 2024
1 parent 72b3a7d commit 109d7e9
Showing 25 changed files with 97 additions and 102 deletions.
5 changes: 2 additions & 3 deletions docs/README.md
@@ -15,12 +15,11 @@ locally, follow these steps:
```

2. Convert the current version of the documentation notebooks to markdown and build the
- website locally using `mike`:
+ website locally using `mkdocs`:

```bash
jupyter nbconvert --to markdown ../tutorials/*.ipynb --output-dir docs/tutorials/
- jupyter nbconvert --to markdown ../examples/*.ipynb --output-dir docs/examples/
- mike serve
+ mkdocs serve
```

### Deployment
8 changes: 3 additions & 5 deletions docs/docs/contribute.md
@@ -224,18 +224,16 @@ pip install -e ".[doc]"
Then, you can build the website locally by running the following in the `docs` folder:

```bash
- mike serve
+ mkdocs serve
```

This will build the website on a local host address shown in the terminal. Changes to
the website files or a browser refresh will immediately rebuild the website.

- If you want to build the latest version of the tutorial notebooks, you need to convert
- them to markdown first:
+ If you have updated the tutorials or examples, you need to convert them to markdown first:

```bash
cd docs
- jupyter nbconvert --to markdown ../examples/*.ipynb --output-dir docs/examples/
jupyter nbconvert --to markdown ../tutorials/*.ipynb --output-dir docs/tutorials/
- mike serve
+ mkdocs serve
```
2 changes: 0 additions & 2 deletions docs/docs/examples/.gitignore

This file was deleted.

9 changes: 9 additions & 0 deletions docs/docs/index.md
@@ -149,6 +149,15 @@ methods](tutorials/16_implemented_methods.md).
inference** <br> by Deistler, Goncalves & Macke (NeurIPS 2022)
<br>[[Paper]](https://arxiv.org/abs/2210.04815)

+ - **Flow matching for scalable simulation-based inference**<br> by Dax, M., Wildberger,
+   J., Buchholz, S., Green, S. R., Macke, J. H., & Schölkopf, B. (NeurIPS 2023)<br>
+   [[Paper]](https://arxiv.org/abs/2305.17161)
+
+ - **Compositional Score Modeling for Simulation-Based Inference**<br> by Geffner, T.,
+   Papamakarios, G., & Mnih, A. (ICML 2023)<br>
+   [[Paper]](https://proceedings.mlr.press/v202/geffner23a.html)
+
### Likelihood-estimation (`(S)NLE`)

- **Sequential neural likelihood: Fast likelihood-free inference with
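The two papers added to the landing page above (flow matching and compositional score modeling) correspond to trainers that recent `sbi` releases expose through the usual interface. A minimal, hedged sketch — the `FMPE` class name and its trainer-style API are assumptions based on sbi ≥ 0.23, not something this commit verifies:

```python
import torch
from sbi.inference import FMPE  # assumed name of the flow-matching trainer
from sbi.utils import BoxUniform

# Toy problem: 3-d parameters, simulator adds Gaussian noise.
prior = BoxUniform(low=-2 * torch.ones(3), high=2 * torch.ones(3))
theta = prior.sample((1000,))
x = theta + 0.1 * torch.randn_like(theta)

inference = FMPE(prior=prior)
inference.append_simulations(theta, x).train()
posterior = inference.build_posterior()
samples = posterior.sample((100,), x=torch.zeros(3))
```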
35 changes: 17 additions & 18 deletions docs/docs/tutorials/index.md
@@ -23,37 +23,36 @@ inference.
## Advanced

<div class="grid cards" markdown>
- - [Multi-round inference](03_multiround_inference.md)
- - [Sampling algorithms in sbi](11_sampler_interface.md)
- - [Custom density estimators](04_density_estimators.md)
- - [Embedding nets for observations](05_embedding_net.md)
- - [SBI with trial-based data](14_iid_data_and_permutation_invariant_embeddings.md)
- - [Handling invalid simulations](08_restriction_estimator.md)
- - [Crafting summary statistics](10_crafting_summary_statistics.md)
- - [Importance sampling posteriors](17_importance_sampled_posteriors.md)
+ - [Multi-round inference](02_multiround_inference.md)
+ - [Sampling algorithms in sbi](09_sampler_interface.md)
+ - [Custom density estimators](03_density_estimators.md)
+ - [Embedding nets for observations](04_embedding_networks.md)
+ - [SBI with trial-based data](12_iid_data_and_permutation_invariant_embeddings.md)
+ - [Handling invalid simulations](06_restriction_estimator.md)
+ - [Crafting summary statistics](08_crafting_summary_statistics.md)
+ - [Importance sampling posteriors](15_importance_sampled_posteriors.md)
</div>

## Diagnostics

<div class="grid cards" markdown>
- - [Posterior predictive checks](12_diagnostics_posterior_predictive_check.md)
- - [Simulation-based calibration](13_diagnostics_simulation_based_calibration.md)
- - [Density plots and MCMC diagnostics with ArviZ](15_mcmc_diagnostics_with_arviz.md)
- - [Local-C2ST coverage checks](18_diagnostics_lc2st.md)
+ - [Posterior predictive checks](10_diagnostics_posterior_predictive_checks.md)
+ - [Simulation-based calibration](11_diagnostics_simulation_based_calibration.md)
+ - [Local-C2ST coverage checks](13_diagnostics_lc2st.md)
+ - [Density plots and MCMC diagnostics with ArviZ](14_mcmc_diagnostics_with_arviz.md)
</div>


## Analysis

<div class="grid cards" markdown>
- - [Conditional distributions](07_conditional_distributions.md)
- - [Posterior sensitivity analysis](09_sensitivity_analysis.md)
- - [Plotting functionality](19_plotting_functionality.md)
+ - [Conditional distributions](05_conditional_distributions.md)
+ - [Posterior sensitivity analysis](07_sensitivity_analysis.md)
+ - [Plotting functionality](17_plotting_functionality.md)
</div>

## Examples

<div class="grid cards" markdown>
- - [Hodgkin-Huxley model](../examples/00_HH_simulator.md)
- - [Decision-making model](../examples/01_decision_making_model.md)
+ - [Hodgkin-Huxley model](Example_00_HodgkinHuxleyModel.md)
+ - [Decision-making model](Example_01_DecisionMakingModel.md)
</div>
14 changes: 11 additions & 3 deletions tutorials/00_getting_started_flexible.ipynb
@@ -11,7 +11,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Note, you can find the original version of this notebook at [https://github.com/sbi-dev/sbi/blob/main/tutorials/00_getting_started_flexible.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/00_getting_started_flexible.ipynb) in the `sbi` repository."
"Note, you can find the original version of this notebook at [/tutorials/00_getting_started_flexible.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/00_getting_started_flexible.ipynb) in the `sbi` repository."
]
},
{
@@ -423,10 +423,18 @@
"source": [
"## Next steps\n",
"\n",
"To learn more about the capabilities of `sbi`, you can head over to the tutorial on [inferring parameters for multiple observations ](https://sbi-dev.github.io/sbi/tutorial/01_gaussian_amortized/) which introduces the concept of amortization. \n",
"To learn more about the capabilities of `sbi`, you can head over to the tutorial\n",
"[01_gaussian_amortized](01_gaussian_amortized.md), for inferring parameters for multiple\n",
"observations without retraining.\n",
"\n",
"Alternatively, for an example with an __actual__ simulator, you can read our [example for a scientific simulator from neuroscience](https://sbi-dev.github.io/sbi/examples/00_HH_simulator/)."
"Alternatively, for an example with an __actual__ simulator, you can read our example\n",
"for a scientific simulator from neuroscience under [Example_00_HodgkinHuxleyModel](Example_00_HodgkinHuxleyModel.md)."
]
},
+ {
+  "cell_type": "markdown",
+  "metadata": {},
+  "source": []
+ }
],
"metadata": {
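The "Next steps" cell above refers to the flexible interface this notebook teaches. For orientation, a minimal sketch of that workflow on a toy simulator — the `SNPE`/`BoxUniform` API is the one these tutorials use, while the simulator and budget are placeholders:

```python
import torch
from sbi.inference import SNPE
from sbi.utils import BoxUniform

# Toy setup: 2-d parameters, simulator adds Gaussian noise.
prior = BoxUniform(low=-2 * torch.ones(2), high=2 * torch.ones(2))

def simulator(theta: torch.Tensor) -> torch.Tensor:
    return theta + 0.1 * torch.randn_like(theta)

theta = prior.sample((2000,))
x = simulator(theta)

# Train a neural posterior estimator, then sample given an observation.
inference = SNPE(prior=prior)
density_estimator = inference.append_simulations(theta, x).train()
posterior = inference.build_posterior(density_estimator)
samples = posterior.sample((1000,), x=torch.zeros(2))
```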
6 changes: 3 additions & 3 deletions tutorials/01_gaussian_amortized.ipynb
@@ -11,7 +11,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Note, you can find the original version of this notebook at [https://github.com/sbi-dev/sbi/blob/main/tutorials/01_gaussian_amortized.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/01_gaussian_amortized.ipynb) in the `sbi` repository."
"Note, you can find the original version of this notebook at [tutorials/01_gaussian_amortized.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/01_gaussian_amortized.ipynb) in the `sbi` repository."
]
},
{
@@ -258,8 +258,8 @@
"source": [
"# Next steps\n",
"\n",
"Now that you got familiar with amortization and are probably good to go and have a first shot at applying `sbi` to your own inference problem. If you want to learn more, we recommend checking out our tutorial on\n",
"[multiround inference ](https://sbi-dev.github.io/sbi/tutorial/03_multiround_inference/) which aims to make inference for a single observation more sampling efficient."
"Now that you got familiar with amortization and are probably good to go and have a first shot at applying `sbi` to your own inference problem. If you want to learn more, we recommend checking out our tutorial\n",
"[02_multiround_inference](02_multiround_inference.md) which aims to make inference for a single observation more sampling efficient."
]
}
],
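The point of the amortization tutorial above is that one training run serves many observations. A small self-contained sketch of that reuse (toy simulator; only the conditioning observation changes between the two sampling calls):

```python
import torch
from sbi.inference import SNPE
from sbi.utils import BoxUniform

prior = BoxUniform(low=-2 * torch.ones(2), high=2 * torch.ones(2))
theta = prior.sample((2000,))
x = theta + 0.1 * torch.randn_like(theta)  # toy simulator

inference = SNPE(prior=prior)
posterior = inference.build_posterior(inference.append_simulations(theta, x).train())

# No retraining between these calls -- the posterior is amortized over x.
samples_1 = posterior.sample((1000,), x=torch.zeros(2))
samples_2 = posterior.sample((1000,), x=0.5 * torch.ones(2))
```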
tutorials/03_multiround_inference.ipynb → tutorials/02_multiround_inference.ipynb
@@ -17,7 +17,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Note, you can find the original version of this notebook at [https://github.com/sbi-dev/sbi/blob/main/tutorials/03_multiround_inference.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/03_multiround_inference.ipynb) in the `sbi` repository.\n"
"Note, you can find the original version of this notebook at [tutorials/02_multiround_inference.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/02_multiround_inference.ipynb) in the `sbi` repository.\n"
]
},
{
@@ -358,7 +358,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.19"
"version": "3.12.0"
}
},
"nbformat": 4,
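The multi-round loop this renamed notebook builds boils down to refining the proposal around a single observation. A hedged sketch of that loop (toy simulator; round count and per-round budget are placeholders):

```python
import torch
from sbi.inference import SNPE
from sbi.utils import BoxUniform

prior = BoxUniform(low=-2 * torch.ones(2), high=2 * torch.ones(2))
simulator = lambda theta: theta + 0.1 * torch.randn_like(theta)
x_o = torch.zeros(2)  # the single observation the budget is focused on

inference = SNPE(prior=prior)
proposal = prior
for _ in range(2):  # two rounds as an example
    theta = proposal.sample((500,))
    x = simulator(theta)
    density_estimator = inference.append_simulations(
        theta, x, proposal=proposal
    ).train()
    posterior = inference.build_posterior(density_estimator)
    proposal = posterior.set_default_x(x_o)  # next round simulates near x_o
```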
tutorials/04_density_estimators.ipynb → tutorials/03_density_estimators.ipynb
@@ -101,7 +101,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"It is also possible to pass an `embedding_net` to `posterior_nn()` which learn summary statistics from high-dimensional simulation outputs. You can find a more detailed tutorial on this [here](https://sbi-dev.github.io/sbi/tutorial/05_embedding_net/).\n"
"It is also possible to pass an `embedding_net` to `posterior_nn()` which learn summary\n",
"statistics from high-dimensional simulation outputs. You can find a more detailed\n",
"tutorial on this in [04_embedding_networks](04_embedding_networks.md).\n"
]
},
{
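The edited cell above mentions `posterior_nn()`. A short sketch of how a custom density estimator is configured and handed to the trainer — the `nsf` model string and the hyperparameters are illustrative choices, not recommendations:

```python
import torch
from sbi.inference import SNPE
from sbi.utils import BoxUniform, posterior_nn

prior = BoxUniform(low=-2 * torch.ones(2), high=2 * torch.ones(2))

# Factory for a neural spline flow with custom capacity.
density_estimator_fun = posterior_nn(
    model="nsf", hidden_features=60, num_transforms=3
)
inference = SNPE(prior=prior, density_estimator=density_estimator_fun)
```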
tutorials/05_embedding_net.ipynb → tutorials/04_embedding_networks.ipynb
@@ -7,7 +7,7 @@
"# Embedding nets for observations\n",
"\n",
"!!! note\n",
" You can find the original version of this notebook at [tutorials/05_embedding_net.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/05_embedding_net.ipynb) in the `sbi` repository.\n",
" You can find the original version of this notebook at [tutorials/04_embedding_networks.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/04_embedding_networks.ipynb) in the `sbi` repository.\n",
"\n",
"## Introduction\n",
"\n",
@@ -481,7 +481,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.14"
"version": "3.12.0"
}
},
"nbformat": 4,
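This renamed notebook is about learned summary statistics. A minimal sketch of plugging a small fully connected embedding net into `posterior_nn()` — the 100-dimensional raw output and the layer sizes are made up for illustration:

```python
import torch
import torch.nn as nn
from sbi.inference import SNPE
from sbi.utils import BoxUniform, posterior_nn

prior = BoxUniform(low=-2 * torch.ones(2), high=2 * torch.ones(2))

# Map 100-d raw simulator output to 10 learned summary features.
embedding_net = nn.Sequential(nn.Linear(100, 50), nn.ReLU(), nn.Linear(50, 10))
density_estimator_fun = posterior_nn(model="maf", embedding_net=embedding_net)
inference = SNPE(prior=prior, density_estimator=density_estimator_fun)
```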
tutorials/07_conditional_distributions.ipynb → tutorials/05_conditional_distributions.ipynb
@@ -15,7 +15,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Note, you can find the original version of this notebook at [https://github.com/sbi-dev/sbi/blob/main/tutorials/07_conditional_distributions.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/07_conditional_distributions.ipynb) in the `sbi` repository.\n"
"Note, you can find the original version of this notebook at [tutorials/05_conditional_distributions.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/05_conditional_distributions.ipynb) in the `sbi` repository.\n"
]
},
{
@@ -26182,7 +26182,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Now we want to build the conditional potential (please read throught the [sampler interface tutorial](https://www.mackelab.org/sbi/tutorial/11_sampler_interface/) for an explanation of potential functions). For this, we have to pass a `condition`. In our case, we want to condition the forth parameter on $\\theta_4=0.2$. Regardless of how many parameters one wants to condition on, in `sbi`, one has to pass a `condition` value for all parameters. The first three values will simply be ignored. We can tell the algorithm which parameters should be kept fixed and which ones should be sampled with the argument `dims_to_sample`.\n"
"Now we want to build the conditional potential (please read throught the tutorial [09_sampler_interface](https://www.mackelab.org/sbi/tutorial/09_sampler_interface/) for an explanation of potential functions). For this, we have to pass a `condition`. In our case, we want to condition the forth parameter on $\\theta_4=0.2$. Regardless of how many parameters one wants to condition on, in `sbi`, one has to pass a `condition` value for all parameters. The first three values will simply be ignored. We can tell the algorithm which parameters should be kept fixed and which ones should be sampled with the argument `dims_to_sample`.\n"
]
},
{
@@ -26322,7 +26322,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.8"
"version": "3.12.0"
}
},
"nbformat": 4,
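The edited cell above describes conditioning via `condition` and `dims_to_sample`. A hedged sketch of how those arguments fit together — the helpers `posterior_estimator_based_potential` and `conditional_potential` are assumptions about the `sbi.inference`/`sbi.analysis` API that the tutorial refers to, not something this commit verifies:

```python
import torch
from sbi.analysis import conditional_potential  # assumed helper
from sbi.inference import SNPE, MCMCPosterior, posterior_estimator_based_potential
from sbi.utils import BoxUniform

# Train a toy 4-d posterior estimator first.
prior = BoxUniform(low=-2 * torch.ones(4), high=2 * torch.ones(4))
theta = prior.sample((1000,))
x = theta + 0.1 * torch.randn_like(theta)
inference = SNPE(prior=prior)
posterior_estimator = inference.append_simulations(theta, x).train()

x_o = torch.zeros(4)
potential_fn, theta_transform = posterior_estimator_based_potential(
    posterior_estimator, prior, x_o
)

# Values for ALL parameters must be passed; only theta_4 = 0.2 is used here.
condition = torch.tensor([0.0, 0.0, 0.0, 0.2])
conditioned_potential, restricted_tf, restricted_prior = conditional_potential(
    potential_fn=potential_fn,
    theta_transform=theta_transform,
    prior=prior,
    condition=condition,
    dims_to_sample=[0, 1, 2],  # parameters that remain free
)
posterior = MCMCPosterior(
    conditioned_potential, theta_transform=restricted_tf, proposal=restricted_prior
)
```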
@@ -378,7 +378,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.18"
"version": "3.12.0"
}
},
"nbformat": 4,
@@ -471,7 +471,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.18"
"version": "3.12.0"
}
},
"nbformat": 4,
tutorials/10_crafting_summary_statistics.ipynb → tutorials/08_crafting_summary_statistics.ipynb
@@ -11,7 +11,12 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Many simulators produce outputs that are high-dimesional. For example, a simulator might generate a time series or an image. In a [previous tutorial](https://sbi-dev.github.io/sbi/tutorial/05_embedding_net/), we discussed how a neural networks can be used to learn summary statistics from such data. In this notebook, we will instead focus on hand-crafting summary statistics. We demonstrate that the choice of summary statistics can be crucial for the performance of the inference algorithm.\n"
"Many simulators produce outputs that are high-dimesional. For example, a simulator might\n",
"generate a time series or an image. In the tutorial on [04_embedding_networks](04_embedding_networks.md), we discussed how a\n",
"neural networks can be used to learn summary statistics from such data. In this\n",
"notebook, we will instead focus on hand-crafting summary statistics. We demonstrate that\n",
"the choice of summary statistics can be crucial for the performance of the inference\n",
"algorithm.\n"
]
},
{
@@ -781,7 +786,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.7"
"version": "3.12.0"
}
},
"nbformat": 4,
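To make the hand-crafting idea concrete, here is a tiny illustration of the kind of reduction the notebook builds, assuming a simulator that returns one time series per parameter set (plain PyTorch, no sbi-specific API):

```python
import torch

def summary_statistics(x: torch.Tensor) -> torch.Tensor:
    """Reduce a batch of time series (batch, T) to four hand-crafted features."""
    return torch.stack(
        [x.mean(dim=1), x.std(dim=1), x.max(dim=1).values, x.min(dim=1).values],
        dim=1,
    )

raw = torch.randn(100, 1000)   # 100 simulated series of length 1000
x = summary_statistics(raw)    # shape (100, 4), used as `x` during training
```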
tutorials/11_sampler_interface.ipynb → tutorials/09_sampler_interface.ipynb
@@ -6,8 +6,6 @@
"source": [
"# Sampling algorithms in `sbi`\n",
"\n",
"Note: this tutorial requires that the user is already familiar with the [flexible interface](https://sbi-dev.github.io/sbi/tutorial/02_flexible_interface/).\n",
"\n",
"`sbi` implements three methods: SNPE, SNLE, and SNRE. When using SNPE, the trained neural network directly approximates the posterior. Thus, sampling from the posterior can be done by sampling from the trained neural network. The neural networks trained in SNLE and SNRE approximate the likelihood(-ratio). Thus, in order to draw samples from the posterior, one has to perform additional sampling steps, e.g. Markov-chain Monte-Carlo (MCMC). In `sbi`, the implemented samplers are:\n",
"\n",
"- Markov-chain Monte-Carlo (MCMC)\n",
@@ -358,7 +356,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.18"
"version": "3.12.0"
}
},
"nbformat": 4,
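The core of this renamed tutorial is building a posterior from a likelihood estimator plus a sampler. A sketch following the names the tutorial uses — `slice_np_vectorized` as the MCMC method is an illustrative choice among the samplers listed above:

```python
import torch
from sbi.inference import SNLE, MCMCPosterior, likelihood_estimator_based_potential
from sbi.utils import BoxUniform

prior = BoxUniform(low=-2 * torch.ones(2), high=2 * torch.ones(2))
theta = prior.sample((1000,))
x = theta + 0.1 * torch.randn_like(theta)  # toy simulator

# Train a likelihood estimator, then sample its posterior with MCMC.
inference = SNLE(prior=prior)
likelihood_estimator = inference.append_simulations(theta, x).train()

x_o = torch.zeros(2)
potential_fn, theta_transform = likelihood_estimator_based_potential(
    likelihood_estimator, prior, x_o
)
posterior = MCMCPosterior(
    potential_fn,
    proposal=prior,
    theta_transform=theta_transform,
    method="slice_np_vectorized",
)
samples = posterior.sample((1000,))
```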
tutorials/13_diagnostics_simulation_based_calibration.ipynb → tutorials/11_diagnostics_simulation_based_calibration.ipynb
@@ -6,7 +6,16 @@
"source": [
"# Simulation-based Calibration in SBI\n",
"\n",
"After a density estimator has been trained with simulated data to obtain a posterior, the estimator should be made subject to several **diagnostic tests**. This needs to be performed before being used for inference given the actual observed data. _Posterior Predictive Checks_ (see [previous tutorial](https://sbi-dev.github.io/sbi/tutorial/12_diagnostics_posterior_predictive_check/)) provide one way to \"critique\" a trained estimator based on its predictive performance. Another important approach to such diagnostics is simulation-based calibration as developed by [Cook et al, 2006](https://www.tandfonline.com/doi/abs/10.1198/106186006X136976) and [Talts et al, 2018](https://arxiv.org/abs/1804.06788). This tutorial will demonstrate and teach you this technique with sbi.\n",
"After a density estimator has been trained with simulated data to obtain a posterior,\n",
"the estimator should be made subject to several **diagnostic tests**. This needs to be\n",
"performed before being used for inference given the actual observed data. _Posterior\n",
"Predictive Checks_ (see [10_diagnostics_posterior_predictive_checks\n",
"tutorial](10_diagnostics_posterior_predictive_checks.md)) provide one way to \"critique\" a trained\n",
"estimator based on its predictive performance. Another important approach to such\n",
"diagnostics is simulation-based calibration as developed by [Cook et al,\n",
"2006](https://www.tandfonline.com/doi/abs/10.1198/106186006X136976) and [Talts et al,\n",
"2018](https://arxiv.org/abs/1804.06788). This tutorial will demonstrate and teach you\n",
"this technique with sbi.\n",
"\n",
"**Simulation-based calibration** (SBC) provides a (qualitative) view and a quantitive measure to check, whether the variances of the posterior are balanced, i.e., neither over-confident nor under-confident. As such, SBC can be viewed as a necessary condition (but not sufficient) for a valid inference algorithm: If SBC checks fail, this tells you that your inference is invalid. If SBC checks pass, this is no guarantee that the posterior estimation is working.\n"
]
@@ -38,7 +47,7 @@
"\n",
"**SBC can inform us whether we are not wrong.** However, it cannot tell us whether we are right, i.e., SBC checks a necessary condition. For example, imagine you run SBC using the prior as a posterior. The ranks would be perfectly uniform. But the inference would be wrong as this scenario would only occur if the posterior is uninformative.\n",
"\n",
"**The Posterior Predictive Checks (see [tutorial 12](https://sbi-dev.github.io/sbi/tutorial/12_diagnostics_posterior_predictive_check/)) can be seen as the complementary sufficient check** for the posterior (only as a methaphor, no theoretical guarantees here). Using the prior as a posterior and then doing predictive checks would clearly show that inference failed.\n",
"**Posterior Predictive Checks can be seen as the complementary sufficient check** for the posterior (only as a methaphor, no theoretical guarantees here). Using the prior as a posterior and then doing predictive checks would clearly show that inference failed.\n",
"\n",
"To summarize, SBC can:\n",
"\n",
@@ -1115,11 +1124,6 @@
"interpreting TARP like SBC and complementing coverage checks with posterior predictive\n",
"checks."
]
- },
- {
-  "cell_type": "markdown",
-  "metadata": {},
-  "source": []
}
],
"metadata": {
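A hedged sketch of the SBC workflow the rewritten cell describes — `run_sbc`/`check_sbc` under `sbi.analysis` is an assumption based on sbi's diagnostics module, and the trained `posterior` is built inline on a toy problem:

```python
import torch
from sbi.analysis import run_sbc, check_sbc  # assumed diagnostics helpers
from sbi.inference import SNPE
from sbi.utils import BoxUniform

prior = BoxUniform(low=-2 * torch.ones(2), high=2 * torch.ones(2))
simulator = lambda theta: theta + 0.1 * torch.randn_like(theta)

theta = prior.sample((2000,))
x = simulator(theta)
inference = SNPE(prior=prior)
posterior = inference.build_posterior(inference.append_simulations(theta, x).train())

# SBC: fresh prior draws, rank each true parameter among posterior samples.
thetas_sbc = prior.sample((200,))
xs_sbc = simulator(thetas_sbc)
ranks, dap_samples = run_sbc(thetas_sbc, xs_sbc, posterior, num_posterior_samples=1000)
stats = check_sbc(ranks, thetas_sbc, dap_samples, num_posterior_samples=1000)
# Uniform ranks are necessary (but not sufficient) for a calibrated posterior.
```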
@@ -881,13 +881,6 @@
" fontsize=12,\n",
");"
]
- },
- {
-  "cell_type": "code",
-  "execution_count": null,
-  "metadata": {},
-  "outputs": [],
-  "source": []
}
],
"metadata": {
@@ -906,7 +899,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.14"
"version": "3.12.0"
},
"toc": {
"base_numbering": 1,