Merge pull request #98 from sbesson/json_snippets
Extract specification snippets as standalone JSON files
sbesson authored Mar 18, 2022
2 parents 047bc0b + 51ccdd9 commit 8dec691
Showing 3 changed files with 73 additions and 42 deletions.
54 changes: 54 additions & 0 deletions 0.4/examples/valid_strict/multiscales_example.json
@@ -0,0 +1,54 @@
{
"multiscales": [
{
"version": "0.4",
"name": "example",
"axes": [
{"name": "t", "type": "time", "unit": "millisecond"},
{"name": "c", "type": "channel"},
{"name": "z", "type": "space", "unit": "micrometer"},
{"name": "y", "type": "space", "unit": "micrometer"},
{"name": "x", "type": "space", "unit": "micrometer"}
],
"datasets": [
{
"path": "0",
"coordinateTransformations": [{
// the voxel size for the first scale level (0.5 micrometer)
"type": "scale",
"scale": [1.0, 1.0, 0.5, 0.5, 0.5]
}]
},
{
"path": "1",
"coordinateTransformations": [{
// the voxel size for the second scale level (downscaled by a factor of 2 -> 1 micrometer)
"type": "scale",
"scale": [1.0, 1.0, 1.0, 1.0, 1.0]
}]
},
{
"path": "2",
"coordinateTransformations": [{
// the voxel size for the third scale level (downscaled by a factor of 4 -> 2 micrometer)
"type": "scale",
"scale": [1.0, 1.0, 2.0, 2.0, 2.0]
}]
}
],
"coordinateTransformations": [{
// the time unit (0.1 milliseconds), which is the same for each scale level
"type": "scale",
"scale": [0.1, 1.0, 1.0, 1.0, 1.0]
}],
"type": "gaussian",
"metadata": {
"description": "the fields in metadata depend on the downscaling implementation. Here, the parameters passed to the skimage function are given",
"method": "skimage.transform.pyramid_gaussian",
"version": "0.16.1",
"args": "[true]",
"kwargs": {"multichannel": true}
}
}
]
}
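
To make the scale arithmetic in the comments concrete: a physical coordinate is the array
index multiplied elementwise by the dataset "scale" and then by the top-level "scale",
which applies to every scale level. A minimal sketch using the values from this example
(the voxel index is arbitrary, chosen for illustration):

```
# Axis order is t, c, z, y, x, as declared in "axes".
index         = [2, 0, 4, 10, 10]          # a voxel in scale level "0"
dataset_scale = [1.0, 1.0, 0.5, 0.5, 0.5]  # datasets[0] transform
global_scale  = [0.1, 1.0, 1.0, 1.0, 1.0]  # top-level transform

physical = [i * d * g for i, d, g in zip(index, dataset_scale, global_scale)]
print(physical)  # [0.2, 0.0, 2.0, 5.0, 5.0] -> 0.2 ms, channel 0, micrometers
```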
54 changes: 15 additions & 39 deletions 0.4/index.bs
@@ -92,6 +92,16 @@ of bioimaging data, whether during acquisition or sharing in the cloud.
Note: The following text makes use of OME-Zarr [[ome-zarr-py]], the current prototype implementation,
for all examples.

Document conventions
--------------------

The key words “MUST”, “MUST NOT”, “REQUIRED”, “SHALL”, “SHALL NOT”, “SHOULD”, “SHOULD NOT”,
“RECOMMENDED”, “MAY”, and “OPTIONAL” are to be interpreted as described in
[RFC 2119](https://tools.ietf.org/html/rfc2119).

Some of the JSON examples in this document include comments. However, these are for
clarity purposes only; comments MUST NOT be included in actual JSON objects.
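
Since plain JSON has no comment syntax, a consumer of the commented example files needs
to drop the `//` lines before parsing. A minimal sketch of such a loader, mirroring the
updated test in this commit (the helper name is hypothetical, and the path assumes the
repository root as working directory):

```
import json

def load_commented_json(path):
    # Drop lines whose first non-whitespace characters are "//",
    # then parse the remainder as ordinary JSON.
    with open(path) as f:
        data = ''.join(
            line for line in f if not line.lstrip().startswith('//'))
    return json.loads(data)

metadata = load_commented_json(
    "0.4/examples/valid_strict/multiscales_example.json")
```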

On-disk (or in-cloud) layout {#on-disk}
=======================================

@@ -285,45 +295,11 @@ Each "multiscales" dictionary SHOULD contain the field "name". It SHOULD contain
Each "multiscales" dictionary SHOULD contain the field "type", which gives the type of downscaling method used to generate the multiscale image pyramid.
It SHOULD contain the field "metadata", which contains a dictionary with additional information about the downscaling method.

```
{
"multiscales": [
{
"version": "0.4",
"name": "example",
"axes": [
{"name": "t", "type": "time", "unit": "millisecond"},
{"name": "c", "type": "channel"},
{"name": "z", "type": "space", "unit": "micrometer"},
{"name": "y", "type": "space", "unit": "micrometer"},
{"name": "x", "type": "space", "unit": "micrometer"}
],
"datasets": [
{
"path": "0",
"coordinateTransformations": [{"type": "scale", "scale": [1.0, 1.0, 0.5, 0.5, 0.5]}] # the voxel size for the first scale level (0.5 micrometer)
}
{
"path": "1",
"coordinateTransformations": [{"type": "scale", "scale": [1.0, 1.0, 1.0, 1.0, 1.0]}] # the voxel size for the second scale level (downscaled by a factor of 2 -> 1 micrometer)
},
{
"path": "2",
"coordinateTransformations": [{"type": "scale", "scale": [1.0, 1.0, 2.0, 2.0, 2.0]}] # the voxel size for the second scale level (downscaled by a factor of 4 -> 2 micrometer)
}
],
"coordinateTransformations": [{"type": "scale", "scale": [0.1, 1.0, 1.0, 1.0, 1.0]], # the time unit (0.1 milliseconds), which is the same for each scale level
"type": "gaussian",
"metadata": { # the fields in metadata depend on the downscaling implementation
"method": "skimage.transform.pyramid_gaussian", # here, the paramters passed to the skimage function are given
"version": "0.16.1",
"args": "[true]",
"kwargs": {"multichannel": true}
}
}
]
}
```
<pre class=include-code>
path: examples/valid_strict/multiscales_example.json
highlight: json
</pre>


If only one multiscale is provided, use it. Otherwise, the user can choose by
name, using the first multiscale as a fallback:
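
A minimal sketch of that selection logic (the helper name is hypothetical, not part of
the specification):

```
def select_multiscale(multiscales, name=None):
    # If only one multiscale is provided, use it.
    if len(multiscales) == 1:
        return multiscales[0]
    # Otherwise select by name, falling back to the first entry.
    for multiscale in multiscales:
        if multiscale.get("name") == name:
            return multiscale
    return multiscales[0]
```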
7 changes: 4 additions & 3 deletions 0.4/tests/test_validation.py
@@ -35,9 +35,10 @@ def ids(files):
"testfile", valid_strict_files, ids=ids(valid_strict_files))
def test_valid_strict(testfile):
with open(testfile) as f:
json_file = json.load(f)
validator.validate(json_file)
strict_validator.validate(json_file)
data = ''.join(line for line in f if not line.lstrip().startswith('//'))
jsondata = json.loads(data)
validator.validate(jsondata)
strict_validator.validate(jsondata)


@pytest.mark.parametrize("testfile", valid_files, ids=ids(valid_files))
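With this change the validators receive plain JSON: lines whose first non-whitespace
characters are `//` are dropped before `json.loads`, so the documentation comments in
the example files never reach schema validation. Assuming a checkout with pytest
installed, the suite can be run with `pytest 0.4/tests/test_validation.py`.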
