Example Usage and Tutorials
The deepImageJ case studies repository contains various examples demonstrating how to use deepImageJ for different bioimage analysis tasks. These case studies illustrate the application of deep learning models for image-to-image translation, nuclei segmentation, and integration with the BioImage Model Zoo. Each example includes scripts, macros, and detailed instructions to replicate the workflows. More details can be found in this article.
Case Study 1: Image-to-Image Translation and Nuclei Segmentation with Pix2Pix and StarDist
- Description: This case study demonstrates the use of deepImageJ for performing image-to-image translation from actin to DAPI images, followed by nuclei segmentation using the StarDist model. The pipeline is then integrated into a macro for batch processing, demonstrating the ability of deepImageJ to run several models in a simple macro script.
- Files:
  - `prepare_dataset.py`: Script for preparing the dataset for image-to-image translation and segmentation.
  - `StarDist_Postprocess_macro_CS1.ijm`: ImageJ macro for StarDist post-processing in Case Study 1.
- Notebooks: Pix2Pix and StarDist 2D from ZeroCostDL4Mic.
- Datasets: Lifeact-RFP actin images (Pix2Pix) and sir-DNA DAPI images (StarDist).
- Models: Pix2Pix for Image Translation from Lifeact-RFP to sir-DNA and StarDist Model for Nuclei Segmentation in Synthetic Lifeact-RFP.
To fine-tune the models for your specific data, follow these steps:
- Download Datasets:
  - Obtain the two datasets required for this case study. If using Google Colab, upload these datasets to Google Drive (a minimal mounting snippet is shown below). If fine-tuning locally, ensure the datasets are accessible on your local drive.
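If you work in Google Colab, mounting your Drive makes the uploaded datasets visible to the notebooks. A minimal snippet; the dataset paths are placeholders, not the repository's folder names:

```python
# Mount Google Drive inside a Colab session so the uploaded datasets are accessible.
from google.colab import drive

drive.mount('/content/drive')

# Example (placeholder) locations for the two datasets after uploading them to Drive:
pix2pix_dataset = '/content/drive/MyDrive/CaseStudy1/Pix2Pix_dataset'
stardist_dataset = '/content/drive/MyDrive/CaseStudy1/StarDist_dataset'
```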
- Fine-Tune Pix2Pix:
  - Use the ZeroCostDL4Mic notebook for Pix2Pix to fine-tune the Pix2Pix model for 200 epochs with default parameters.
  - Training parameters:
    - Batch size: 1
    - Loss function: Vanilla Generative Adversarial Network (GAN)
    - Patch size: 512 × 512
    - Initial learning rate: 2e-4
    - Data augmentation: None
  - The Pix2Pix model is exported using PyTorch 2.0.1.
- Fine-Tune StarDist:
  - Use the ZeroCostDL4Mic notebook for StarDist to fine-tune the StarDist model for 100 epochs (see the sketch after this list for how these settings map onto the StarDist Python API).
  - Training parameters:
    - Dataset: 45 paired image patches (1024 × 1024)
    - Patch size: 1024 × 1024
    - Batch size: 2
    - Initial learning rate: 3e-4
    - Data augmentation: None
  - The StarDist model is exported using TensorFlow 2.14.
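For reference, a minimal sketch of how the settings listed above would look if you fine-tuned with the StarDist Python API directly rather than through the notebook. `X_trn`, `Y_trn`, `X_val`, and `Y_val` are placeholders for your loaded image/mask arrays, and the model name is hypothetical:

```python
# Minimal sketch: training StarDist 2D with the Case Study 1 settings listed above.
# X_trn, Y_trn, X_val, Y_val are placeholders for normalized images and integer label masks.
# (The ZeroCostDL4Mic notebook additionally handles loading pretrained weights when fine-tuning.)
from stardist.models import Config2D, StarDist2D

config = Config2D(
    n_channel_in=1,
    train_patch_size=(1024, 1024),   # patch size
    train_batch_size=2,              # batch size
    train_learning_rate=3e-4,        # initial learning rate
    train_epochs=100,                # epochs
)

model = StarDist2D(config, name='stardist_sirDNA_finetune', basedir='models')
model.train(X_trn, Y_trn, validation_data=(X_val, Y_val), augmenter=None)  # no data augmentation
model.optimize_thresholds(X_val, Y_val)  # tune probability/NMS thresholds on the validation data
```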
- Export Models:
  - Ensure both fine-tuned models are exported in the BioImage Model Zoo format. Pay careful attention to the exporting and packaging requirements, including metadata.
Once the models are exported, follow these steps to install and use them in deepImageJ:
- Install Models in deepImageJ:
  - Open Fiji and navigate to `Plugins > deepImageJ > deepImageJ Install Model`.
  - Move to the `Private Model` tab, select `From ZIP file`, and add the path to each model zip file, one by one.
- Run the Macro:
  - Use the macro provided here.
  - Modify the paths for the input and output folders in the macro to match your directories.
  - The macro workflow includes:
    - Running the fine-tuned Pix2Pix model through deepImageJ to perform image translation from actin to DAPI images.
    - Running the fine-tuned StarDist model on the synthetic DAPI images.
    - Performing the StarDist post-processing steps.
- Output:
  - You will obtain a folder with the masks of your input images. If using the same data, expect five masks corresponding to the five time points.
To conclude this use case, you can use TrackMate for tracking and data visualization:
- Load Time Points:
  - Load the final five masks into TrackMate.
- Perform Tracking:
  - Follow the default options and documentation in TrackMate to perform the tracking analysis.
  - Refer to the TrackMate documentation for detailed instructions.
By following these steps, you will successfully fine-tune and apply Pix2Pix and StarDist models using deepImageJ, and perform tracking analysis with TrackMate.
Case Study 2: 3D Nuclei Segmentation of a Developing Tribolium Castaneum Embryo with StarDist
- Description: In this case study, deepImageJ is used for detailed 3D nuclei segmentation, showcasing its capability to handle complex volumetric data. The pipeline involves generating ground truth data, followed by nuclei segmentation with StarDist, and is then integrated into a macro for batch processing.
- Files:
  - `Generated_GT.py`: Script for generating ground truth data for 3D nuclei segmentation.
  - `Mount_stardist_dataset.py`: Script for setting up the StarDist dataset for 3D segmentation.
  - `StarDist_postprocess_macro_cs2.ijm`: ImageJ macro for StarDist post-processing in Case Study 2.
- Notebooks: StarDist 2D
- Dataset: Developing Tribolium Castaneum Embryo from the Cell Tracking Challenge
- Model: StarDist Model for Developing Tribolium Castaneum Embryo
The dataset for this case study is downloaded from the Cell Tracking Challenge. It includes two different embryos. Embryo 01 is used for fine-tuning StarDist, and embryo 02 is reserved for the complete pipeline in Fiji.
- Generate Ground Truth for Embryo 01:
  - Annotations for embryo 01 are sparse. To create a training dataset containing the partially annotated slices, run the `Generate_GT.py` script to extract these 2D slices into a new directory (a rough sketch of this kind of extraction is shown below).
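For orientation, a rough, hypothetical sketch of what such a slice-extraction step can look like. It is not the repository's `Generate_GT.py`; the file and folder names are placeholders:

```python
# Hypothetical sketch: keep only the z-slices of embryo 01 that actually contain annotations.
# File and folder names are placeholders; the repository's Generate_GT.py is the reference.
import os
import numpy as np
from tifffile import imread, imwrite

raw = imread('embryo01/t000.tif')          # 3D raw volume (z, y, x)
gt = imread('embryo01_GT/man_seg000.tif')  # 3D sparse annotation volume (z, y, x)

os.makedirs('slices/images', exist_ok=True)
os.makedirs('slices/masks', exist_ok=True)

for z in range(gt.shape[0]):
    if np.any(gt[z] > 0):  # slice contains at least one annotated nucleus
        imwrite(f'slices/images/slice_{z:03d}.tif', raw[z])
        imwrite(f'slices/masks/slice_{z:03d}.tif', gt[z])
```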
- Prepare Training and Testing Sets:
  - Use the `mount_stardist_dataset.py` script to separate the extracted slices of embryo 01 into `train/` and `test/` folders for fine-tuning in the notebook (a minimal split sketch is shown below).
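A minimal, hypothetical sketch of such a split. Again, this is not the repository script; the paths and the 90/10 ratio are assumptions for illustration:

```python
# Hypothetical sketch: randomly split the extracted slice pairs into train/ and test/ folders.
# Paths and the 90/10 ratio are assumptions, not the repository's exact settings.
import os
import random
import shutil

random.seed(0)
images = sorted(os.listdir('slices/images'))
random.shuffle(images)

n_test = max(1, int(0.1 * len(images)))
splits = {'test': images[:n_test], 'train': images[n_test:]}

for split, names in splits.items():
    os.makedirs(f'dataset/{split}/images', exist_ok=True)
    os.makedirs(f'dataset/{split}/masks', exist_ok=True)
    for name in names:
        shutil.copy(f'slices/images/{name}', f'dataset/{split}/images/{name}')
        shutil.copy(f'slices/masks/{name}', f'dataset/{split}/masks/{name}')
```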
- Pre-process Datasets:
  - Noise reduction:
    - Apply a median filter with a radius of 7.0 pixels to all images from embryos 01 and 02 (not the masks) in Fiji through `Process > Filters > Median`.
  - Reduce computational costs:
    - Downsample the images in the x and y dimensions to half their size for both embryos. Additionally, halve the number of slices along the z-axis for embryo 02 (a scripted equivalent of this preprocessing is sketched after this list).
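If you prefer to batch this preprocessing in Python rather than clicking through Fiji (the tutorial itself uses the Fiji menu commands above), a roughly equivalent sketch with scikit-image might look as follows. The file paths are placeholders, and the `disk(7)` footprint approximates Fiji's radius-7 median filter:

```python
# Rough scripted equivalent of the Fiji preprocessing: median filter of radius 7,
# 2x downsampling in x/y, and halving the z slices (embryo 02 only). Paths are placeholders.
import numpy as np
from tifffile import imread, imwrite
from skimage.filters import median
from skimage.morphology import disk
from skimage.transform import rescale

volume = imread('embryo02/t000.tif')  # (z, y, x)

processed = []
for z_index, slice_2d in enumerate(volume):
    if z_index % 2 != 0:              # embryo 02 only: keep every other slice to halve z
        continue
    denoised = median(slice_2d, footprint=disk(7))                           # ~ Fiji median, radius 7
    small = rescale(denoised, 0.5, anti_aliasing=True, preserve_range=True)  # halve x and y
    processed.append(small.astype(volume.dtype))

imwrite('embryo02_preprocessed/t000.tif', np.stack(processed))
```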
After these steps, you will have two folders: one with embryo 01 ready for fine-tuning StarDist and another with embryo 02 ready for use and testing in Fiji.
- Prepare the Dataset:
  - Ensure you have a folder containing `train/` and `test/` subfolders for embryo 01.
- Fine-Tune StarDist:
  - Use the same ZeroCostDL4Mic notebook for StarDist as in Case Study 1 to fine-tune the model with the new data (only the configuration values change; a short sketch follows below).
  - Training parameters:
    - Epochs: 50
    - Image patches: 40 paired patches of size 512 × 512, cropped from the original images (1871 × 965 pixels)
    - Batch size: 15
    - Loss function: MAE
    - Learning rate: 5e-05
    - Validation data: 10%
    - Number of rays: 32
    - Grid parameter: 2
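As in Case Study 1, a hedged sketch of how these values map onto the StarDist `Config2D` (the notebook sets them through its form fields; MAE is StarDist's default distance loss, and the 10% validation split is made when you divide your data):

```python
# Sketch only: the Case Study 2 settings expressed as a StarDist Config2D.
from stardist.models import Config2D

config = Config2D(
    n_rays=32,                      # number of rays
    grid=(2, 2),                    # grid parameter
    train_patch_size=(512, 512),    # patch size
    train_batch_size=15,            # batch size
    train_learning_rate=5e-5,       # learning rate
    train_epochs=50,                # epochs
)
```

The remaining training call is the same as in the Case Study 1 sketch.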
- Export the Model:
  - After successful training, export the model in the BioImage Model Zoo format.
- Install the Model:
  - Open Fiji and navigate to `Plugins > deepImageJ > deepImageJ Install Model`.
  - Select `From ZIP file` in the `Private Model` tab and add the path to your model zip file.
- Run the Macro:
  - Use the provided macro here to apply StarDist and the post-processing steps.
  - Modify the paths for the input and output folders as necessary.
  - The macro runs StarDist on each slice of the 3D volume of one time point of embryo 02 and outputs a folder containing one mask per slice.
- Analyze the 3D Volume:
  - Use the Connected Components plugin from MorphoLibJ to analyze the 3D volume for each time point.
  - Apply the connected components analysis over the entire stack to obtain a comprehensive analysis of the embryo (a scripted sanity check is sketched below).
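If you want to cross-check the connected-component analysis outside Fiji (the tutorial itself uses MorphoLibJ), a rough sketch with scikit-image, with placeholder paths:

```python
# Rough cross-check of the 3D connected-component analysis, outside Fiji.
# The per-slice StarDist masks are stacked into one volume, binarized, and labeled in 3D.
import numpy as np
from glob import glob
from tifffile import imread
from skimage.measure import label, regionprops

slice_files = sorted(glob('masks_t000/*.tif'))          # placeholder: one mask per slice
stack = np.stack([imread(f) for f in slice_files])      # (z, y, x)

labels_3d = label(stack > 0, connectivity=1)            # 6-connectivity in 3D
print(f'{labels_3d.max()} connected components')
for region in regionprops(labels_3d):
    print(region.label, region.area, region.centroid)
```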
- Next Steps:
  - Repeat the analysis for each time point of embryo 02 to create a 4D video, allowing you to analyze the movement and behavior of the embryo over time.
By following these steps, you will fine-tune and apply the StarDist model using deepImageJ and analyze the results using Connected Components in MorphoLibJ.
Case Study 3: Segmentation of Arabidopsis Apical Stem Cells and Integration with the BioImage Model Zoo in deepImageJ
- Description: This case study involves the segmentation of Arabidopsis apical stem cells and demonstrates the integration with the BioImage Model Zoo using deepImageJ.
- Dataset: Research data supporting Cell size and growth regulation in the Arabidopsis thaliana apical stem cell niche
- Models: 3D UNet Arabidopsis Apical Stem Cells, with the `emotional-cricket` ID on the BioImage Model Zoo
This case study utilizes a model already available on the BioImage Model Zoo. Follow these steps:
- Locate the Model:
  - Visit the BioImage Model Zoo and search for the model under the `emotional-cricket` nickname.
- Download the Dataset:
  - The dataset can be downloaded from the repository indicated above. Focus on the specific 3D volume needed for this case study.
  - Navigate to `PNAS > PNAS > plant 13 > processed_tiffs` within the downloaded data and select the file named `84hrs_plant13_trim-acylYFP_improved.tif`.
- Install the Model:
  - Open Fiji and navigate to `Plugins > deepImageJ > deepImageJ Install Model`.
  - Install the model by selecting the appropriate zip file from the BioImage Model Zoo.
- Run the Model:
  - Open the 3D volume (`84hrs_plant13_trim-acylYFP_improved.tif`) in Fiji.
  - Navigate to `Plugins > deepImageJ > deepImageJ Run`.
  - Choose the installed model and run the inference to obtain a mask for the segmented root of the plant.
The post-processing pipeline includes two steps:
- Gamma Correction:
  - Apply a gamma correction with a value of 0.80 to enhance membrane visibility and reduce blurriness.
- Morphological Segmentation:
  - Use the Morphological Segmentation tool from MorphoLibJ for segmentation and visualization.
  - Set the tolerance to 10 to effectively depict the catchment basins and overlay them on the segmented image (a rough scripted analogue is sketched after this section).
This precise application of Morphological Segmentation ensures clear and distinct visualization of each cell.
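For readers who want to approximate this post-processing in Python (the tutorial itself uses Fiji and MorphoLibJ), a rough analogue with scikit-image is sketched below. It assumes the gamma correction is applied to a membrane/boundary image, the MorphoLibJ tolerance of 10 is approximated by the h-minima depth, and the file path is a placeholder:

```python
# Rough Python analogue of the Fiji post-processing: gamma correction followed by a
# marker-based watershed, with the MorphoLibJ tolerance approximated by the h-minima depth.
from tifffile import imread
from skimage.exposure import adjust_gamma
from skimage.measure import label
from skimage.morphology import h_minima
from skimage.segmentation import watershed

boundaries = imread('deepimagej_output_boundaries.tif')   # placeholder: predicted membrane image

corrected = adjust_gamma(boundaries, gamma=0.80)          # gamma correction, value 0.80
markers = label(h_minima(corrected, h=10))                # seeds from minima deeper than the tolerance
cells = watershed(corrected, markers)                     # one label per catchment basin (cell)
```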
By following these steps, you will be able to download and use the dataset and model from the BioImage Model Zoo, perform inference with deepImageJ, and apply post-processing steps in Fiji.