Commit fd99baf

Enable automatic versioning via setuptools_scm (#2111)

* Enable automatic versioning via setuptools_scm. Based on cornellius-gp/linear_operator#2
* Resolve RTD warnings from example notebooks

1 parent: ed9d662

File tree: 16 files changed (+84, −34 lines)

.github/workflows/deploy.yml

Lines changed: 4 additions & 1 deletion

@@ -23,12 +23,13 @@ jobs:
       - name: Install dependencies
         run: |
           python -m pip install --upgrade pip
-          pip install setuptools wheel twine
+          pip install setuptools_scm setuptools wheel twine
       - name: Build and publish
         env:
           TWINE_USERNAME: ${{ secrets.PYPI_USERNAME }}
           TWINE_PASSWORD: ${{ secrets.PYPI_PASSWORD }}
         run: |
+          python -m setuptools_scm
           python setup.py sdist bdist_wheel
           twine upload dist/*
@@ -44,11 +45,13 @@ jobs:
       - name: Install dependencies
         run: |
           conda install -y anaconda-client conda-build
+          pip install setuptools_scm
      - name: Build and publish
        run: |
          conda config --set anaconda_upload yes
          conda config --append channels pytorch
          /usr/share/miniconda/bin/anaconda login --username ${{ secrets.CONDA_USERNAME }} --password ${{ secrets.CONDA_PASSWORD }}
+          python -m setuptools_scm
          cd .conda
          conda build .
          /usr/share/miniconda/bin/anaconda logout
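In both publish jobs, `python -m setuptools_scm` prints the version that setuptools_scm derives from git metadata; between tagged releases this is a PEP 440 development version of the form `<version>.devN+g<short-hash>`. A minimal sketch of that format (the sample string below is hypothetical, not taken from this commit):

```python
import re

# setuptools_scm-style development version:
#   <version>.dev<commits since last tag>+g<abbreviated commit hash>
# (hypothetical example string, for illustration only)
sample = "1.9.1.dev27+gfd99baf"

match = re.fullmatch(
    r"(?P<base>\d+(?:\.\d+)*)\.dev(?P<distance>\d+)\+g(?P<node>[0-9a-f]+)",
    sample,
)
assert match is not None
print(match.group("base"), match.group("distance"), match.group("node"))
```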

.gitignore

Lines changed: 3 additions & 0 deletions

@@ -153,3 +153,6 @@ Temporary Items
 # Data files that might be downloaded
 **/*.tar.gz
 **/*.txt
+
+# version file (auto-generated by setuptools_scm)
+gpytorch/version.py

.readthedocs.yml

Lines changed: 8 additions & 1 deletion

@@ -5,6 +5,14 @@
 # Required
 version: 2
 
+build:
+  os: "ubuntu-22.04"
+  tools:
+    python: "3.8"
+  jobs:
+    pre_build:
+      - python -m setuptools_scm
+
 # Build documentation in the docs/ directory with Sphinx
 sphinx:
   configuration: docs/source/conf.py
@@ -16,7 +24,6 @@ sphinx:
 
 # Optionally set the version of Python and requirements required to build your docs
 python:
-  version: 3.8
   install:
     - requirements: requirements.txt
     - requirements: docs/requirements.txt
docs/requirements.txt

Lines changed: 1 addition & 0 deletions

@@ -1,3 +1,4 @@
+setuptools_scm
 nbformat
 ipython
 ipykernel

docs/source/conf.py

Lines changed: 7 additions & 5 deletions

@@ -28,11 +28,13 @@ def read(*names, **kwargs):
 
 
 def find_version(*file_paths):
-    version_file = read(*file_paths)
-    version_match = re.search(r"^__version__ = ['\"]([^'\"]*)['\"]", version_file, re.M)
-    if version_match:
+    try:
+        with io.open(os.path.join(os.path.dirname(__file__), *file_paths), encoding="utf8") as fp:
+            version_file = fp.read()
+        version_match = re.search(r"^__version__ = version = ['\"]([^'\"]*)['\"]", version_file, re.M)
         return version_match.group(1)
-    raise RuntimeError("Unable to find version string.")
+    except Exception as e:
+        raise RuntimeError("Unable to find version string:\n", e)
 
 
 sys.path.append(os.path.abspath(os.path.join(__file__, "..", "..", "..")))
@@ -63,7 +65,7 @@ def find_version(*file_paths):
 author = "Cornellius GP"
 
 # The short X.Y version
-version = find_version("gpytorch", "__init__.py")
+version = find_version("..", "..", "gpytorch", "version.py")
 # The full version, including alpha/beta/rc tags
 release = version
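The rewritten `find_version` expects the two-name assignment (`__version__ = version = '…'`) that setuptools_scm emits when it writes `gpytorch/version.py`. A minimal check of that regex against a hand-written stand-in for the generated file (the version number is hypothetical):

```python
import re

# Stand-in for the version.py that setuptools_scm generates
# (hypothetical version number, for illustration only)
version_file = "# coding: utf-8\n__version__ = version = '1.9.1'\n"

version_match = re.search(
    r"^__version__ = version = ['\"]([^'\"]*)['\"]", version_file, re.M
)
assert version_match is not None
print(version_match.group(1))  # → 1.9.1
```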

examples/00_Basic_Usage/Metrics.ipynb

Lines changed: 7 additions & 6 deletions

@@ -12,10 +12,12 @@
     "\n",
     "We'll be modeling the function\n",
     "\n",
+    "$$\n",
     "\\begin{align}\n",
-    "y &= \\sin(2\\pi x) + \\epsilon \\\\\n",
+    "  y &= \\sin(2\\pi x) + \\epsilon \\\\\n",
     "  \\epsilon &\\sim \\mathcal{N}(0, 0.04) \n",
-    "\\end{align}"
+    "\\end{align}\n",
+    "$$\n"
    ]
   },
   {
@@ -441,7 +443,7 @@
    "hash": "d4d1e4263499bec80672ea0156c357c1ee493ec2b1c70f0acce89fc37c4a6abe"
   },
   "kernelspec": {
-   "display_name": "Python 3.8.12 ('base')",
+   "display_name": "Python 3 (ipykernel)",
    "language": "python",
    "name": "python3"
   },
@@ -455,9 +457,8 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.8.12"
-  },
-  "orig_nbformat": 4
+   "version": "3.8.0"
+  }
  },
 "nbformat": 4,
 "nbformat_minor": 2

examples/01_Exact_GPs/GP_Regression_DistributionalKernel.ipynb

Lines changed: 5 additions & 3 deletions

@@ -10,10 +10,12 @@
     "\n",
     "In this notebook, we're going to demonstrate one way of dealing with uncertainty in our training data. Let's say that we're collecting training data that models the following function.\n",
     "\n",
+    "$$\n",
     "\\begin{align}\n",
     "y &= \\sin(2\\pi x) + \\epsilon \\\\\n",
     "  \\epsilon &\\sim \\mathcal{N}(0, 0.2) \n",
     "\\end{align}\n",
+    "$$\n",
     "\n",
     "However, now assume that we're a bit uncertain about our features. In particular, we're going to assume that every `x_i` value is not a point but a distribution instead. E.g.\n",
     "\n",
@@ -780,9 +782,9 @@
  ],
  "metadata": {
   "kernelspec": {
-   "display_name": "Python 3.7.4 64-bit ('base': conda)",
+   "display_name": "Python 3 (ipykernel)",
    "language": "python",
-   "name": "python37464bitbaseconda52eab690427c4f7ea56588deee120c46"
+   "name": "python3"
   },
   "language_info": {
    "codemirror_mode": {
@@ -794,7 +796,7 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.7.4"
+   "version": "3.8.0"
   }
  },
 "nbformat": 4,

examples/01_Exact_GPs/Simple_GP_Regression.ipynb

Lines changed: 4 additions & 2 deletions

@@ -10,10 +10,12 @@
     "\n",
     "In this notebook, we demonstrate many of the design features of GPyTorch using the simplest example, training an RBF kernel Gaussian process on a simple function. We'll be modeling the function\n",
     "\n",
+    "$$\n",
     "\\begin{align}\n",
     "y &= \\sin(2\\pi x) + \\epsilon \\\\\n",
     "  \\epsilon &\\sim \\mathcal{N}(0, 0.04) \n",
     "\\end{align}\n",
+    "$$\n",
     "\n",
     "with 100 training examples, and testing on 51 test examples.\n",
     "\n",
@@ -361,7 +363,7 @@
  "metadata": {
   "anaconda-cloud": {},
   "kernelspec": {
-   "display_name": "Python 3",
+   "display_name": "Python 3 (ipykernel)",
    "language": "python",
    "name": "python3"
   },
@@ -375,7 +377,7 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.7.0"
+   "version": "3.8.0"
   }
  },
 "nbformat": 4,

examples/02_Scalable_Exact_GPs/Simple_GP_Regression_CUDA.ipynb

Lines changed: 4 additions & 2 deletions

@@ -18,10 +18,12 @@
     "\n",
     "In this notebook, we demonstrate many of the design features of GPyTorch using the simplest example, training an RBF kernel Gaussian process on a simple function. We'll be modeling the function\n",
     "\n",
+    "$$\n",
     "\\begin{align}\n",
     " y &= \\sin(2\\pi x) + \\epsilon \\\\ \n",
     "  \\epsilon &\\sim \\mathcal{N}(0, 0.2) \n",
     "\\end{align}\n",
+    "$$\n",
     "\n",
     "with 11 training examples, and testing on 51 test examples."
    ]
@@ -326,7 +328,7 @@
  "metadata": {
   "anaconda-cloud": {},
   "kernelspec": {
-   "display_name": "Python 3",
+   "display_name": "Python 3 (ipykernel)",
    "language": "python",
    "name": "python3"
   },
@@ -340,7 +342,7 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.7.3"
+   "version": "3.8.0"
   }
  },
 "nbformat": 4,

examples/04_Variational_and_Approximate_GPs/Approximate_GP_Objective_Functions.ipynb

Lines changed: 2 additions & 0 deletions

@@ -214,10 +214,12 @@
    "The **predictive log likelihood** is an alternative to the variational ELBO that was proposed in [Jankowiak et al., 2020](https://arxiv.org/abs/1910.07123).\n",
    "It typically produces better predictive variances than the `gpytorch.mlls.VariationalELBO` objective.\n",
    "\n",
+    "$$\n",
    "\\begin{align}\n",
    "\\mathcal{L}_\\text{PLL} &= \\mathbb{E}_{p_\\text{data}( y, \\mathbf x )} \\left[ \\log p( y \\! \\mid \\! \\mathbf x) \\right] - \\beta \\: \\text{KL} \\left[ q( \\mathbf u) \\Vert p( \\mathbf u) \\right] \\\\ \n",
    " &\\approx \\sum_{i=1}^N \\log \\mathbb{E}_{q(\\mathbf u)} \\left[ \\int p( y_i \\! \\mid \\! f_i) p(f_i \\! \\mid \\! \\mathbf u) \\: d f_i \\right] - \\beta \\: \\text{KL} \\left[ q( \\mathbf u) \\Vert p( \\mathbf u) \\right] \n",
    "\\end{align}\n",
+    "$$\n",
    "\n",
    "Note that this objective is *very similar* to the variational ELBO.\n",
    "The only difference is that the $\\log$ occurs *outside* the expectation $\\mathbb E_{q(\\mathbf u)}$.\n",