Commit 82a42fa

Update Sphinx documentation, commit 697a5eb [skip ci].
bluescarni committed Nov 28, 2023
1 parent 8d63617 commit 82a42fa
Showing 100 changed files with 5,671 additions and 1,762 deletions.
Binary and non-displayable files not shown.
3 changes: 3 additions & 0 deletions _sources/examples_ml.rst
@@ -26,6 +26,9 @@ Machine Learning
.. toctree::
:maxdepth: 1

notebooks/ffnn
notebooks/torch_and_heyoka
notebooks/NeuralHamiltonianODEs
notebooks/NeuralODEs
notebooks/differentiable_atmosphere

629 changes: 26 additions & 603 deletions _sources/notebooks/NeuralHamiltonianODEs.ipynb

Large diffs are not rendered by default.

8 changes: 4 additions & 4 deletions _sources/notebooks/NeuralODEs.ipynb

Large diffs are not rendered by default.

800 changes: 800 additions & 0 deletions _sources/notebooks/differentiable_atmosphere.ipynb

Large diffs are not rendered by default.

589 changes: 589 additions & 0 deletions _sources/notebooks/ffnn.ipynb

Large diffs are not rendered by default.

270 changes: 270 additions & 0 deletions _sources/notebooks/torch_and_heyoka.ipynb
@@ -0,0 +1,270 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Interfacing *torch* to *heyoka.py*\n",
"\n",
"```{note}\n",
"For an introduction on feed forward neural networks in *heyoka.py*, check out the example: [Feed-Forward Neural Networks](<./ffnn.ipynb>).\n",
"```\n",
"\n",
"\n",
"```{warning}\n",
"This tutorial assumes [torch](https://pytorch.org/) is installed\n",
"```\n",
"\n",
"*heyoka.py* is not a machine learning library, nor does it aspire to be one. However, given its support for feed-forward neural networks and their potential use in numerical integration, it is useful to connect the *heyoka.py* `ffnn()` factory to a *torch* model. This tutorial shows how."
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"import heyoka as hk\n",
"\n",
"import numpy as np\n",
"\n",
"# We will need torch for this notebook. The CPU version is enough though.\n",
"import torch\n",
"from torch import nn"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We start by defining a feed-forward neural network model in *torch*. As a test case, we use a network with two hidden layers of 32 neurons each, `tanh` nonlinearities in the hidden layers and a `sigmoid` on the output layer. We define the model so that it maps easily to the *heyoka* `ffnn` factory, but other styles are also possible.\n",
"\n",
"This will typically look something like:"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"# Let us use float64. This is not strictly necessary, as heyoka also supports float32, but we go for high precision here.\n",
"torch.set_default_dtype(torch.float64)\n",
"\n",
"class torch_net(nn.Module):\n",
"    def __init__(\n",
"        self,\n",
"        n_inputs=4,\n",
"        nn_hidden=[32, 32],\n",
"        n_out=1,\n",
"        activations=[nn.Tanh(), nn.Tanh(), nn.Sigmoid()]\n",
"    ):\n",
"        super().__init__()\n",
"\n",
"        # Sizes of all the layers, from input to output.\n",
"        dims = [n_inputs] + nn_hidden + [n_out]\n",
"\n",
"        # Linear layers.\n",
"        self.fcs = nn.ModuleList([nn.Linear(dims[i], dims[i + 1]) for i in range(len(dims) - 1)])\n",
"\n",
"        # Non-linearities.\n",
"        self.acts = nn.ModuleList(activations)\n",
"\n",
"    def forward(self, x):\n",
"        for fc, act in zip(self.fcs, self.acts):\n",
"            x = act(fc(x))\n",
"        return x\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Weights and biases are stored by *torch* as per-layer tensors in the model, while *heyoka* expects a single flat sequence containing all the weights first, followed by all the biases.\n",
"\n",
"The following function converts the *torch* representation to *heyoka*'s:"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [],
"source": [
"def weights_and_biases_heyoka(model):\n",
"    weights = {}\n",
"    biases = {}\n",
"\n",
"    for name, param in model.named_parameters():\n",
"        if 'weight' in name:\n",
"            weights[name] = param.data.clone()\n",
"        elif 'bias' in name:\n",
"            biases[name] = param.data.clone()\n",
"\n",
"    # Flatten all the weights first, then all the biases.\n",
"    w_flat = np.concatenate([w.numpy().flatten() for w in weights.values()])\n",
"    b_flat = np.concatenate([b.numpy().flatten() for b in biases.values()])\n",
"    print(w_flat.shape)\n",
"    return np.concatenate((w_flat, b_flat))"
]
},
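{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick sanity check of the flattening convention (a minimal illustration on a throwaway toy model), a single `Linear(2, 3)` layer should yield 6 weights followed by 3 biases, i.e. a flattened vector of length 9:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Illustration only: check the flattening convention on a toy single-layer model.\n",
"toy = nn.Sequential(nn.Linear(2, 3))\n",
"flat = weights_and_biases_heyoka(toy)\n",
"# The 2*3 = 6 weights come first, followed by the 3 biases.\n",
"assert flat.shape == (9,)"
]
},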
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We are now ready to instantiate the model and extract its weights and biases in the format expected by `heyoka.ffnn`:"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"(1184,)\n"
]
}
],
"source": [
"model = torch_net(n_inputs=4,\n",
"                  nn_hidden=[32, 32],\n",
"                  n_out=1,\n",
"                  activations=[nn.Tanh(), nn.Tanh(), nn.Sigmoid()])\n",
"\n",
"# Here one would typically train the model. At the end, we extract its parameters:\n",
"flattened_weights = weights_and_biases_heyoka(model)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Let us instantiate the feed-forward neural network in *heyoka.py* using those parameters:"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [],
"source": [
"inp_1, inp_2, inp_3, inp_4 = hk.make_vars(\"inp_1\", \"inp_2\", \"inp_3\", \"inp_4\")\n",
"model_heyoka = hk.model.ffnn(inputs=[inp_1, inp_2, inp_3, inp_4],\n",
"                             nn_hidden=[32, 32],\n",
"                             n_out=1,\n",
"                             activations=[hk.tanh, hk.tanh, hk.sigmoid],\n",
"                             nn_wb=flattened_weights)"
]
},
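{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a sanity check (illustrative, not required), the length of `nn_wb` must equal the total number of network parameters: 4*32 + 32*32 + 32*1 = 1184 weights plus 32 + 32 + 1 = 65 biases, i.e. 1249 entries:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# The flattened parameter vector must contain all the weights and biases.\n",
"n_weights = 4 * 32 + 32 * 32 + 32 * 1\n",
"n_biases = 32 + 32 + 1\n",
"assert flattened_weights.shape == (n_weights + n_biases,)"
]
},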
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Good! Just to be sure, let us now verify that the two models produce the same output at inference. First, we compile the network so that we can run inference:"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [],
"source": [
"model_heyoka_compiled = hk.make_cfunc(model_heyoka)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"... and create some random inputs:"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [],
"source": [
"random_input = torch.rand((4, 1000000))\n",
"random_input_torch = random_input.t()\n",
"random_input_numpy = random_input.numpy()\n",
"out_array = np.zeros((1, 1000000))"
]
},
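{
"cell_type": "markdown",
"metadata": {},
"source": [
"Note the layout difference: the compiled *heyoka.py* function evaluates over columns, i.e. inputs of shape `(n_inputs, n_evals)`, whereas *torch* models expect one sample per row, i.e. shape `(n_samples, n_features)`, hence the transpose above:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# heyoka.py: one input variable per row; torch: one sample per row.\n",
"assert random_input_numpy.shape == (4, 1000000)\n",
"assert random_input_torch.shape == (1000000, 4)"
]
},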
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now, let's compare the outputs of `heyoka.ffnn` and `torch` to check that they are identical:"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [],
"source": [
"# Note: we avoid naming a variable 'torch', which would shadow the module.\n",
"hey_out = model_heyoka_compiled(random_input_numpy, outputs=out_array)\n",
"torch_out = model(random_input_torch).detach().numpy().reshape(1, -1)"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Maximum difference in the inference is: 2.220446049250313e-16\n"
]
}
],
"source": [
"print(\"Maximum difference in the inference is: \", (hey_out-torch_out).max())"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"In this way, we have ported the *torch* model to *heyoka.py*, reproducing the same results."
]
}
],
"metadata": {
"kernelspec": {
"display_name": "diff_drag_py38",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.6"
},
"orig_nbformat": 4
},
"nbformat": 4,
"nbformat_minor": 2
}
2 changes: 1 addition & 1 deletion _static/scripts/pydata-sphinx-theme.js

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion _static/scripts/pydata-sphinx-theme.js.map

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion _static/styles/pydata-sphinx-theme.css

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion _static/styles/pydata-sphinx-theme.css.map

Large diffs are not rendered by default.

18 changes: 9 additions & 9 deletions _static/webpack-macros.html
@@ -4,28 +4,28 @@
-->
{# Load FontAwesome icons #}
{% macro head_pre_icons() %}
<link href="{{ pathto('_static/vendor/fontawesome/6.1.2/css/all.min.css', 1) }}?digest=365ca57ee442770a23c6" rel="stylesheet" />
<link href="{{ pathto('_static/vendor/fontawesome/6.1.2/css/all.min.css', 1) }}?digest=5b4479735964841361fd" rel="stylesheet" />
<link rel="preload" as="font" type="font/woff2" crossorigin href="{{ pathto('_static/vendor/fontawesome/6.1.2/webfonts/fa-solid-900.woff2', 1) }}" />
<link rel="preload" as="font" type="font/woff2" crossorigin href="{{ pathto('_static/vendor/fontawesome/6.1.2/webfonts/fa-brands-400.woff2', 1) }}" />
<link rel="preload" as="font" type="font/woff2" crossorigin href="{{ pathto('_static/vendor/fontawesome/6.1.2/webfonts/fa-regular-400.woff2', 1) }}" />
{% endmacro %}

{% macro head_pre_assets() %}
<!-- Loaded before other Sphinx assets -->
<link href="{{ pathto('_static/styles/theme.css', 1) }}?digest=365ca57ee442770a23c6" rel="stylesheet" />
<link href="{{ pathto('_static/styles/bootstrap.css', 1) }}?digest=365ca57ee442770a23c6" rel="stylesheet" />
<link href="{{ pathto('_static/styles/pydata-sphinx-theme.css', 1) }}?digest=365ca57ee442770a23c6" rel="stylesheet" />
<link href="{{ pathto('_static/styles/theme.css', 1) }}?digest=5b4479735964841361fd" rel="stylesheet" />
<link href="{{ pathto('_static/styles/bootstrap.css', 1) }}?digest=5b4479735964841361fd" rel="stylesheet" />
<link href="{{ pathto('_static/styles/pydata-sphinx-theme.css', 1) }}?digest=5b4479735964841361fd" rel="stylesheet" />
{% endmacro %}

{% macro head_js_preload() %}
<!-- Pre-loaded scripts that we'll load fully later -->
<link rel="preload" as="script" href="{{ pathto('_static/scripts/bootstrap.js', 1) }}?digest=365ca57ee442770a23c6" />
<link rel="preload" as="script" href="{{ pathto('_static/scripts/pydata-sphinx-theme.js', 1) }}?digest=365ca57ee442770a23c6" />
<script src="{{ pathto('_static/vendor/fontawesome/6.1.2/js/all.min.js', 1) }}?digest=365ca57ee442770a23c6"></script>
<link rel="preload" as="script" href="{{ pathto('_static/scripts/bootstrap.js', 1) }}?digest=5b4479735964841361fd" />
<link rel="preload" as="script" href="{{ pathto('_static/scripts/pydata-sphinx-theme.js', 1) }}?digest=5b4479735964841361fd" />
<script src="{{ pathto('_static/vendor/fontawesome/6.1.2/js/all.min.js', 1) }}?digest=5b4479735964841361fd"></script>
{% endmacro %}

{% macro body_post() %}
<!-- Scripts loaded after <body> so the DOM is not blocked -->
<script src="{{ pathto('_static/scripts/bootstrap.js', 1) }}?digest=365ca57ee442770a23c6"></script>
<script src="{{ pathto('_static/scripts/pydata-sphinx-theme.js', 1) }}?digest=365ca57ee442770a23c6"></script>
<script src="{{ pathto('_static/scripts/bootstrap.js', 1) }}?digest=5b4479735964841361fd"></script>
<script src="{{ pathto('_static/scripts/pydata-sphinx-theme.js', 1) }}?digest=5b4479735964841361fd"></script>
{% endmacro %}
