HazeFlow: Revisit Haze Physical Model as ODE and Realistic Non-Homogeneous Haze Generation for Real-World Dehazing (ICCV 2025)

Authors

Junseong Shin*, Seungwoo Chung*, Yunjeong Yang, Tae Hyun Kim

(* denotes equal contribution; † denotes the corresponding author.)


This is the official implementation of the ICCV 2025 paper "HazeFlow: Revisit Haze Physical Model as ODE and Realistic Non-Homogeneous Haze Generation for Real-World Dehazing" [paper] / [project page].

Results


More qualitative and quantitative results can be found on the [project page].

📦 Installation

git clone https://github.com/cloor/HazeFlow.git
cd HazeFlow
pip install -r requirements.txt

or

git clone https://github.com/cloor/HazeFlow.git
cd HazeFlow
conda env create -f environment.yaml

Checkpoints can be downloaded here.

🌫️ Haze Generation

Figure: Example of non-homogeneous haze synthesized via MCBM. (a) Generated hazy image. (b) Transmission map T_MCBM. (c) Spatially varying density coefficient map β̃.
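
For reference, these maps correspond to the standard atmospheric scattering model: with clean image J, depth d, atmospheric light A, and spatially varying density β̃, the transmission is t(x) = exp(−β̃(x) · d(x)) and the hazy image is I(x) = J(x) · t(x) + A · (1 − t(x)). HazeFlow's contribution is to recast this physical model as an ODE; see the paper for details.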

You can generate haze density maps using MCBM by running the command below:

python haze_generation/brownian_motion_generation.py
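
The script above is the repository's actual entry point. As a rough illustration of the underlying idea only (not the repository's implementation), a spatially varying density map can be built by accumulating a 2D Brownian random walk into a histogram, then blurring and normalizing it:

import numpy as np
from scipy.ndimage import gaussian_filter

def brownian_density_map(res=256, steps=20000, step_sigma=1.5, blur_sigma=8.0, seed=0):
    # Illustrative only: accumulate visits of a 2D random walk into a
    # histogram, then blur and normalize to get a smooth density map.
    rng = np.random.default_rng(seed)
    pos = rng.uniform(0, res, 2)
    density = np.zeros((res, res))
    for _ in range(steps):
        pos = np.clip(pos + rng.normal(0.0, step_sigma, 2), 0, res - 1)
        density[int(pos[0]), int(pos[1])] += 1.0
    density = gaussian_filter(density, sigma=blur_sigma)
    return (density - density.min()) / (np.ptp(density) + 1e-8)

beta_map = brownian_density_map()  # smooth, spatially varying map in [0, 1]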

🏋️ Training

📁 Dataset Preparation

Please download and organize the datasets as follows:

Dataset  | Description                                              | Download Link
RIDCP500 | 500 clear RGB images                                     | rgb_500 / da_depth_500
RTTS     | Real-world task-driven testing set                       | Link
URHI     | Urban and rural haze images (duplicate-removed version)  | Link
HazeFlow/
├── datasets/
│   ├── RIDCP500/  
│   │   ├── rgb_500/
│   │   ├── da_depth_500/
│   │   ├── MCBM/
│   ├── RTTS/  
│   ├── URHI/           
│   └── custom/             

Before training, make sure the datasets are structured as shown above.
In addition, prepare the MCBM-based haze density maps (generated via the script in the Haze Generation section) and the corresponding depth maps.

To estimate depth maps, follow the instructions provided in the Depth Anything V2 repository and place the depth maps in the datasets/RIDCP500/da_depth_500/ directory.
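
For reference, Depth Anything V2's run.py is typically invoked along the following lines; verify the flags against that repository's README, and note the paths here are illustrative:

python run.py --encoder vitl --img-path /path/to/HazeFlow/datasets/RIDCP500/rgb_500 --outdir /path/to/HazeFlow/datasets/RIDCP500/da_depth_500 --pred-only --grayscale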

Once depth maps are ready, you can proceed to training and inference as described below.

1. Pretrain Phase

We propose using a color loss to reduce color distortion.
You can configure the loss type by editing --config.training.loss_type in pretrain.sh.

sh pretrain.sh
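
For intuition, a simple color loss can penalize global per-channel mean shifts between the output and the reference. The sketch below is illustrative PyTorch, not the repository's exact loss_type implementation:

import torch

def color_loss(pred, target):
    # Illustrative: penalize global per-channel mean shifts between the
    # dehazed output and the reference to discourage color casts.
    # pred, target: (B, 3, H, W)
    return (pred.mean(dim=(2, 3)) - target.mean(dim=(2, 3))).abs().mean()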

2. Reflow Phase

Specify the pretrained checkpoint from the pretrain phase by editing --config.flow.pre_train_model in reflow.sh.

sh reflow.sh
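
Conceptually (following Rectified Flow, on which this code builds), reflow regenerates training pairs with the pretrained model so that trajectories become straighter and fewer sampling steps suffice. A minimal sketch of that idea, assuming a PyTorch velocity network v_model(x, t) — the name and signature are hypothetical, not this repository's API:

import torch

@torch.no_grad()
def make_reflow_pairs(v_model, x0, n_steps=100):
    # Integrate dx/dt = v(x, t) from t=0 to t=1 with Euler steps; the
    # resulting (x0, x1) couples are the reflow training pairs.
    x = x0.clone()
    dt = 1.0 / n_steps
    for i in range(n_steps):
        t = torch.full((x.shape[0],), i * dt, device=x.device)
        x = x + v_model(x, t) * dt
    return x0, x

def reflow_loss(v_model, x0, x1):
    # Regress the straight-line velocity x1 - x0 at a random point on
    # the segment between the paired endpoints.
    t = torch.rand(x0.shape[0], device=x0.device).view(-1, 1, 1, 1)
    xt = (1 - t) * x0 + t * x1
    return ((v_model(xt, t.view(-1)) - (x1 - x0)) ** 2).mean()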

3. Distillation Phase

Specify the checkpoint obtained from the reflow phase by editing --config.flow.pre_train_model in distill.sh.

sh distill.sh
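
Distillation then compresses the reflowed model into a single step: the student's one jump from x0 should land on the teacher's endpoint. A rough sketch under the same hypothetical interface as above:

import torch

def distill_loss(student, x0, teacher_x1):
    # One-step objective: x0 plus the student's predicted displacement
    # at t = 0 should match the (reflowed) teacher's endpoint.
    t0 = torch.zeros(x0.shape[0], device=x0.device)
    return ((x0 + student(x0, t0) - teacher_x1) ** 2).mean()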

Inference & Demo

To run inference on your own images, place them in the datasets/custom/ directory.

Then, configure the following options in sampling.sh:

  • --config.sampling.ckpt: path to your trained model checkpoint
  • --config.data.dataset: name of your dataset (rtts or custom)
  • --config.data.test_data_root: path to your input images

Finally, run:

sh sampling.sh
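
For example, the relevant lines in sampling.sh could read as follows (illustrative values only; the checkpoint filename is hypothetical):

--config.sampling.ckpt checkpoints/hazeflow_distilled.pth \
--config.data.dataset custom \
--config.data.test_data_root datasets/custom/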

🔗 Acknowledgements

Our implementation is based on RectifiedFlow and SlimFlow. We sincerely thank the authors for their contributions to the community.

📚 Citation

If you use this code or find our work helpful, please cite our paper:

@inproceedings{shin2025hazeflow,
    title={HazeFlow: Revisit Haze Physical Model as ODE and Realistic Non-Homogeneous Haze Generation for Real-World Dehazing},
    author={Junseong Shin and Seungwoo Chung and Yunjeong Yang and Tae Hyun Kim},
    booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
    year={2025}
}

Contact

If you have any questions, please contact [email protected].
