Datasets
webKnossos supports two dataset formats:
- webknossos-wrap (WKW). Optimized format for large datasets of 3D voxel imagery. Supports compression, efficient cutouts, multi-channel data and several base datatypes (e.g., `uint8`, `uint16`); see the read sketch after this list.
- KNOSSOS cubes. Dataset of 128x128x128 cubes.
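For programmatic access to WKW data, the webknossos-wrap repository ships Python bindings (the `wkw` package). The following is a minimal read sketch; the dataset path is hypothetical and the exact call signatures (`Dataset.open`, `read`, `close`) are assumed from the `wkw` Python package.

```python
import wkw  # Python bindings from the webknossos-wrap repository (assumed package name: wkw)

# Open magnification step 1 of the color layer (directory layout is described below).
dataset = wkw.Dataset.open("test_dataset/color/1")

# Read a 128x128x128 cutout starting at the origin.
# The returned numpy array is channel-first, e.g. (1, 128, 128, 128) for uint8 grayscale.
cutout = dataset.read((0, 0, 0), (128, 128, 128))
print(cutout.shape, cutout.dtype)

dataset.close()
```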
The following data layer types are supported (see the sketch after this list):
- Grayscale data (`uint8`), also referred to as `color` data
- RGB data (24 Bit)
- Segmentation data (8 Bit, 16 Bit, 32 Bit)
- Multi-channel data (multiple 8 Bit)
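As a rough illustration (not an official API), these layer types map to the following voxel dtypes and channel counts when the data is held in numpy arrays:

```python
import numpy as np

# Hypothetical lookup: layer type -> (voxel dtype, number of channels)
LAYER_TYPES = {
    "grayscale (color)":     (np.uint8, 1),     # 8 Bit grayscale
    "rgb":                   (np.uint8, 3),     # 24 Bit = 3 x 8 Bit channels
    "segmentation (8 bit)":  (np.uint8, 1),
    "segmentation (16 bit)": (np.uint16, 1),
    "segmentation (32 bit)": (np.uint32, 1),
    "multi-channel":         (np.uint8, None),  # several 8 Bit channels, count varies
}

for name, (dtype, channels) in LAYER_TYPES.items():
    print(f"{name}: dtype={np.dtype(dtype).name}, channels={channels}")
```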
Of course, datasets do not need to be created manually. The webKnossos cuber converts image stacks and KNOSSOS cubes into WKW datasets. It also compresses datasets for efficient file storage and creates the necessary metadata.
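To give an idea of what the cuber automates, here is a heavily simplified sketch that writes a stack of grayscale PNG slices into a single-magnification WKW layer using the `wkw` and Pillow packages. File paths, the header arguments and the expected axis order are assumptions; the real cuber additionally handles chunked processing, downsampling and metadata generation.

```python
import glob
import numpy as np
import wkw
from PIL import Image

# Collect the image stack: one 2D grayscale slice per file, sorted by z (paths are hypothetical).
slice_paths = sorted(glob.glob("raw_slices/*.png"))
slices = [np.asarray(Image.open(p), dtype=np.uint8) for p in slice_paths]  # each slice: (y, x)

volume = np.stack(slices, axis=-1)        # (y, x, z)
volume = np.transpose(volume, (1, 0, 2))  # reorder to (x, y, z); wkw is assumed to expect this order

# Create/open magnification 1 of the color layer and write the whole volume at the origin.
dataset = wkw.Dataset.open("test_dataset/color/1", wkw.Header(np.uint8))
dataset.write((0, 0, 0), volume)
dataset.close()
```

The cuber performs the same kind of conversion (plus downsampling and compression) and produces the directory layout shown below.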
test_dataset # One folder per dataset
├─ color # Dataset layer (e.g., color, segmentation)
│ ├─ 1 # Magnification step (1, 2, 4, 8, 16 etc.)
│ │ ├─ header.wkw # Header wkw file
│ │ ├─ z0
│ │ │ ├─ y0
│ │ │ │ ├─ x0.wkw # Actual data wkw file
│ │ │ │ └─ x1.wkw # Actual data wkw file
│ │ │ └─ y1/...
│ │ └─ z1/...
│ └─ 2/...
├─ segmentation/...
└─ datasource-properties.json # Dataset metadata (will be created upon import, if non-existent)
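Given this layout, the .wkw file that contains a particular voxel can be located with simple integer arithmetic. The sketch below assumes the coordinates are already expressed in the target magnification and that the cube length (edge length in voxels covered by one file, 1024 in the sample metadata below) is known.

```python
import os

def wkw_file_for_voxel(dataset_dir, layer, mag, x, y, z, cube_length=1024):
    """Path of the .wkw file containing voxel (x, y, z) at the given magnification.

    cube_length is the edge length (in voxels) covered by one .wkw file,
    as listed under "cubeLength" in datasource-properties.json.
    """
    cx, cy, cz = x // cube_length, y // cube_length, z // cube_length
    return os.path.join(dataset_dir, layer, str(mag), f"z{cz}", f"y{cy}", f"x{cx}.wkw")

# Voxel (1500, 300, 700) of the color layer at magnification 1:
print(wkw_file_for_voxel("test_dataset", "color", 1, 1500, 300, 700))
# -> test_dataset/color/1/z0/y0/x1.wkw
```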
Sample `datasource-properties.json`:
{
  "id" : {
    "name" : "test_dataset",
    "team" : "<unknown>"
  },
  "dataLayers" : [ {
    "name" : "color",
    "category" : "color",
    "boundingBox" : {
      "topLeft" : [ 0, 0, 0 ],
      "width" : 1024,
      "height" : 1024,
      "depth" : 1024
    },
    "wkwResolutions" : [ {
      "resolution" : 1,
      "cubeLength" : 1024
    }, {
      "resolution" : 2,
      "cubeLength" : 1024
    } ],
    "elementClass" : "uint8",
    "dataFormat" : "wkw"
  }, {
    "name" : "segmentation",
    "boundingBox" : {
      "topLeft" : [ 0, 0, 0 ],
      "width" : 1024,
      "height" : 1024,
      "depth" : 1024
    },
    "wkwResolutions" : [ {
      "resolution" : 1,
      "cubeLength" : 1024
    }, {
      "resolution" : 2,
      "cubeLength" : 1024
    } ],
    "elementClass" : "uint32",
    "largestSegmentId" : 1000000000,
    "category" : "segmentation",
    "dataFormat" : "wkw"
  } ],
  "scale" : [ 11.24, 11.24, 28 ]
}
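Since the metadata is plain JSON, it can be inspected with a few lines of standard-library Python (file path as in the layout above):

```python
import json

with open("test_dataset/datasource-properties.json") as f:
    datasource = json.load(f)

print("Dataset:", datasource["id"]["name"])
print("Voxel scale:", datasource["scale"])

for layer in datasource["dataLayers"]:
    bbox = layer["boundingBox"]
    mags = [r["resolution"] for r in layer["wkwResolutions"]]
    print(f"- {layer['name']} ({layer['category']}): {layer['elementClass']}, "
          f"{bbox['width']}x{bbox['height']}x{bbox['depth']} voxels, magnifications {mags}")
```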
Datasets can be added to webKnossos in two ways.
Use the upload function in webKnossos:
- Recommended only for smaller datasets (max. 1 GB)
- Create a zip file of the dataset as specified above (see the sketch after this list)
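Creating the zip archive can be done with any archiver; as one option, Python's standard library (dataset folder name assumed to be `test_dataset` as above):

```python
import shutil

# Creates test_dataset.zip containing the dataset folder
# (color/, segmentation/, datasource-properties.json, ...).
shutil.make_archive("test_dataset", "zip", root_dir=".", base_dir="test_dataset")
```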
File-system import:
- Place the dataset at `<webKnossos directory>/binaryData/<Organization name>/<Dataset name>` (see the sketch after this list)
- Wait for webKnossos to detect the dataset (up to 10 min)
- Go to the dataset view on the dashboard
- Click `Import` for your new dataset
- Review, correct and approve the dataset metadata. Usually, `scale` and `largestSegmentId` need to be set manually.
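Placing (or symlinking) the dataset into the binaryData folder can be scripted as well; the webKnossos directory and organization name below are placeholders.

```python
import os
import shutil

webknossos_dir = "/opt/webknossos"  # placeholder for <webKnossos directory>
organization = "my_organization"    # placeholder for <Organization name>
dataset_name = "test_dataset"       # placeholder for <Dataset name>

target = os.path.join(webknossos_dir, "binaryData", organization, dataset_name)
os.makedirs(os.path.dirname(target), exist_ok=True)

# Copy the dataset into place ...
shutil.copytree(dataset_name, target)
# ... or, to avoid duplicating large datasets, symlink it instead:
# os.symlink(os.path.abspath(dataset_name), target)
```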
KNOSSOS datasets can be reused in webKnossos with slight modifications:
- The folders of the magnification steps have a different naming scheme: `mag1` -> `1`, `mag2` -> `2` (see the sketch after this list)
- webKnossos-compatible metadata needs to be created (i.e., `datasource-properties.json`). `knossos.conf` cannot be used directly.
- Make sure to put each dataset layer into its own directory structure (e.g., `color`, `segmentation`)
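A small script can take care of the renaming; the sketch below renames `mag1`, `mag2`, ... to `1`, `2`, ... inside one layer directory. The path is a placeholder, and the `datasource-properties.json` still has to be written separately.

```python
import re
from pathlib import Path

layer_dir = Path("test_dataset/color")  # placeholder: one layer of the converted KNOSSOS dataset

# Rename KNOSSOS-style magnification folders (mag1, mag2, ...) to webKnossos style (1, 2, ...).
for mag_dir in layer_dir.iterdir():
    match = re.fullmatch(r"mag(\d+)", mag_dir.name)
    if mag_dir.is_dir() and match:
        mag_dir.rename(layer_dir / match.group(1))
```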