Update adapters and converters #440

Draft · wants to merge 49 commits into base: main

49 commits
fbbfc57
Fix weight converters and return their corresponding v5 weight descr
thodkatz Aug 15, 2024
a37b568
Create an interface for weight conversion
thodkatz Aug 15, 2024
db891eb
fix import_callable annotation
FynnBe Dec 2, 2024
a391d94
improve error traceback for single weights format attempt
FynnBe Dec 2, 2024
4103b51
add load_state
FynnBe Dec 2, 2024
7ec7afb
update ONNXModelAdapter
FynnBe Dec 2, 2024
ed8f1db
update TorchscriptModelAdapter typing
FynnBe Dec 2, 2024
9ae626d
update unzipping in tensorflow model adapter
FynnBe Dec 3, 2024
fceed3c
add upper bounds to dependencies
FynnBe Dec 3, 2024
77e1e84
update dev envs
FynnBe Dec 3, 2024
a131369
WIP setup run expensive tests
FynnBe Dec 3, 2024
0888c52
WIP resource tests
FynnBe Dec 3, 2024
40dfe25
expose sha256 arg
FynnBe Dec 3, 2024
bb539d4
update torchscript adapter
FynnBe Dec 4, 2024
fedd43c
bump spec lib version
FynnBe Dec 4, 2024
b7d5f98
Merge remote-tracking branch 'thodkatz/weight-converters' into fix_to…
FynnBe Dec 4, 2024
7f6fdf1
WIP refactor backend libs
FynnBe Dec 5, 2024
7fea808
add summary_path arg
FynnBe Dec 5, 2024
4564f7c
update annotation
FynnBe Dec 5, 2024
00e6ba1
fix tf seeding
FynnBe Dec 5, 2024
5d1e2ce
expose test_description_in_conda_env
FynnBe Dec 5, 2024
df36d15
docstring formatting
FynnBe Dec 5, 2024
376507f
absorb test_description_in_conda_env into test_description
FynnBe Dec 6, 2024
a8a50ec
Merge branch 'install_conda_envs' into fix_torch_load
FynnBe Dec 6, 2024
9690574
all model adapters in backends
FynnBe Dec 6, 2024
f52a894
sort tests
FynnBe Dec 6, 2024
523c54b
add create_model_adapter
FynnBe Dec 9, 2024
d438a12
pin pyright
FynnBe Dec 9, 2024
f9a1a67
continue refactor of weight converters and backends
FynnBe Dec 9, 2024
80f9ed0
update test_weight_converters.py
FynnBe Dec 9, 2024
dad8186
add test_bioimageio_collection.py
FynnBe Dec 9, 2024
e835432
add onnx as dev dep
FynnBe Dec 9, 2024
459696d
add get_pre_and_postprocessing
FynnBe Dec 10, 2024
169cf17
use dim instead of deprecated dims arg name
FynnBe Dec 10, 2024
7d8e7fc
update tests
FynnBe Dec 11, 2024
8b2727e
add todo
FynnBe Dec 11, 2024
02252ac
update pytorch_to_onnx converter
FynnBe Dec 11, 2024
ab8616f
expose determinism to cli test command
FynnBe Dec 18, 2024
f11b428
WIP unify model adapters
FynnBe Dec 18, 2024
e5bbe7a
fix TorchscriptModelAdapter
FynnBe Dec 19, 2024
3720e85
update predict_sample_without_blocking
FynnBe Dec 19, 2024
76c27e9
ensure batch and channel axes have standardized id
FynnBe Dec 19, 2024
4cbfc5a
support validation context 'raise_errors'
FynnBe Dec 20, 2024
447409f
fix ONNXModelAdapter
FynnBe Dec 20, 2024
3b514f8
_get_axis_type -> _guess_axis_type
FynnBe Dec 20, 2024
b264331
fix get_axes_infos
FynnBe Dec 20, 2024
84f24fe
bump pyright version
FynnBe Dec 20, 2024
27ea9aa
add test cases
FynnBe Dec 20, 2024
de759d5
fix pip install with no-deps
FynnBe Jan 7, 2025
27 changes: 23 additions & 4 deletions .github/workflows/build.yaml
@@ -27,6 +27,9 @@ jobs:
strategy:
matrix:
python-version: ['3.8', '3.9', '3.10', '3.11', '3.12']
include:
- python-version: '3.12'
run-expensive-tests: true
steps:
- uses: actions/checkout@v4
- name: Install Conda environment with Micromamba
@@ -39,6 +42,8 @@
create-args: >-
python=${{ matrix.python-version }}
post-cleanup: 'all'
env:
PIP_NO_DEPS: true
- name: Install py3.8 environment
if: matrix.python-version == '3.8'
uses: mamba-org/setup-micromamba@v1
@@ -47,6 +52,8 @@
cache-environment: true
environment-file: dev/env-py38.yaml
post-cleanup: 'all'
env:
PIP_NO_DEPS: true
- name: additional setup
run: pip install --no-deps -e .
- name: Get Date
@@ -63,6 +70,8 @@
run: pytest --disable-pytest-warnings
env:
BIOIMAGEIO_CACHE_PATH: bioimageio_cache
RUN_EXPENSIVE_TESTS: ${{ matrix.run-expensive-tests && 'true' || 'false' }}


test-spec-main:
runs-on: ubuntu-latest
@@ -71,7 +80,8 @@
python-version: ['3.8', '3.12']
include:
- python-version: '3.12'
is-dev-version: true
report-coverage: true
run-expensive-tests: true
steps:
- uses: actions/checkout@v4
- name: Install Conda environment with Micromamba
@@ -84,6 +94,8 @@
create-args: >-
python=${{ matrix.python-version }}
post-cleanup: 'all'
env:
PIP_NO_DEPS: true
- name: Install py3.8 environment
if: matrix.python-version == '3.8'
uses: mamba-org/setup-micromamba@v1
@@ -92,6 +104,8 @@
cache-environment: true
environment-file: dev/env-py38.yaml
post-cleanup: 'all'
env:
PIP_NO_DEPS: true
- name: additional setup spec
run: |
conda remove --yes --force bioimageio.spec || true # allow failure for cached env
@@ -112,17 +126,18 @@
run: pytest --disable-pytest-warnings
env:
BIOIMAGEIO_CACHE_PATH: bioimageio_cache
- if: matrix.is-dev-version && github.event_name == 'pull_request'
RUN_EXPENSIVE_TESTS: ${{ matrix.run-expensive-tests && 'true' || 'false' }}
- if: matrix.report-coverage && github.event_name == 'pull_request'
uses: orgoro/[email protected]
with:
coverageFile: coverage.xml
token: ${{ secrets.GITHUB_TOKEN }}
- if: matrix.is-dev-version && github.ref == 'refs/heads/main'
- if: matrix.report-coverage && github.ref == 'refs/heads/main'
run: |
pip install genbadge[coverage]
genbadge coverage --input-file coverage.xml --output-file ./dist/coverage/coverage-badge.svg
coverage html -d dist/coverage
- if: matrix.is-dev-version && github.ref == 'refs/heads/main'
- if: matrix.report-coverage && github.ref == 'refs/heads/main'
uses: actions/upload-artifact@v4
with:
name: coverage
@@ -147,6 +162,8 @@
create-args: >-
python=${{ matrix.python-version }}
post-cleanup: 'all'
env:
PIP_NO_DEPS: true
- name: additional setup spec
run: |
conda remove --yes --force bioimageio.spec || true # allow failure for cached env
@@ -184,6 +201,8 @@
create-args: >-
python=${{ matrix.python-version }}
post-cleanup: 'all'
env:
PIP_NO_DEPS: true
- name: additional setup
run: pip install --no-deps -e .
- name: Get Date
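Note: the workflow changes above add a `run-expensive-tests` matrix flag and pass it to the test step as `RUN_EXPENSIVE_TESTS`. As a rough sketch (not taken from this PR), a test suite could gate such tests on that variable; the marker name and the example test are hypothetical:

```python
import os

import pytest

# skip unless the CI matrix sets RUN_EXPENSIVE_TESTS=true (see workflow above)
expensive_test = pytest.mark.skipif(
    os.environ.get("RUN_EXPENSIVE_TESTS", "false") != "true",
    reason="expensive tests only run when RUN_EXPENSIVE_TESTS=true",
)


@expensive_test
def test_bioimageio_collection_models():
    ...  # placeholder for an expensive, network-heavy test
```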
6 changes: 6 additions & 0 deletions README.md
@@ -375,6 +375,12 @@ The model specification and its validation tools can be found at <https://github

## Changelog

### 0.7.1 (to be released)

- New feature: `bioimageio.core.test_description` accepts **runtime_env** and **run_command** to test a resource
using the conda environment described by that resource (or another specified conda env)
- raise validation errors if `ValidationContext.raise_errors is True`

### 0.7.0

- breaking:
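For illustration, a minimal sketch of the new `test_description` keywords named in the changelog entry above; the resource source and the `runtime_env` value are assumptions, not taken from this PR:

```python
from bioimageio.core import test_description

# runtime_env / run_command are the new keyword arguments from the changelog;
# "as-described" is an assumed value meaning "use the conda environment
# declared by the resource itself" -- check the docstring for accepted values.
summary = test_description(
    "rdf.yaml",  # hypothetical path or URL to a resource description
    runtime_env="as-described",
)
print(summary)
```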
2 changes: 2 additions & 0 deletions bioimageio/core/__init__.py
@@ -41,6 +41,7 @@
)
from ._settings import settings
from .axis import Axis, AxisId
from .backends import create_model_adapter
from .block_meta import BlockMeta
from .common import MemberId
from .prediction import predict, predict_many
@@ -73,6 +74,7 @@
"commands",
"common",
"compute_dataset_measures",
"create_model_adapter",
"create_prediction_pipeline",
"digest_spec",
"dump_description",
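A usage sketch of the new top-level re-export shown above; the keyword name and the description path are assumptions, since the factory's signature is not part of this diff:

```python
from bioimageio.core import create_model_adapter, load_description

model_descr = load_description("rdf.yaml")  # hypothetical path to a model description
# picks a backend-specific adapter (pytorch, torchscript, onnx, tensorflow, ...)
# for the weights available in the description
adapter = create_model_adapter(model_description=model_descr)
```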
31 changes: 4 additions & 27 deletions bioimageio/core/_prediction_pipeline.py
@@ -121,19 +121,9 @@ def predict_sample_block(
self.apply_preprocessing(sample_block)

output_meta = sample_block.get_transformed_meta(self._block_transform)
output = output_meta.with_data(
{
tid: out
for tid, out in zip(
self._output_ids,
self._adapter.forward(
*(sample_block.members.get(t) for t in self._input_ids)
),
)
if out is not None
},
stat=sample_block.stat,
)
local_output = self._adapter.forward(sample_block)

output = output_meta.with_data(local_output.members, stat=local_output.stat)
if not skip_postprocessing:
self.apply_postprocessing(output)

@@ -152,20 +142,7 @@
if not skip_preprocessing:
self.apply_preprocessing(sample)

output = Sample(
members={
out_id: out
for out_id, out in zip(
self._output_ids,
self._adapter.forward(
*(sample.members.get(in_id) for in_id in self._input_ids)
),
)
if out is not None
},
stat=sample.stat,
id=sample.id,
)
output = self._adapter.forward(sample)
if not skip_postprocessing:
self.apply_postprocessing(output)

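The diff above reflects the unified adapter interface: `forward` now takes a `Sample` and returns a `Sample` (members and stat), instead of positional tensors that the pipeline mapped back to output ids. A hypothetical adapter illustrating that contract (the class and the `Sample` import path are assumptions, not this PR's `ModelAdapter` base):

```python
from bioimageio.core import Sample  # assumed top-level re-export


class EchoAdapter:
    """Illustration only: echoes input members back as outputs."""

    def forward(self, sample: Sample) -> Sample:
        # a real adapter would run inference here and return the model outputs
        return Sample(members=dict(sample.members), stat=sample.stat, id=sample.id)
```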