This repository contains Python tools to perform bathymetric inversion based on deep learning algorithms. Two deep learning models (U-Net and Pix2Pix) are implemented in TensorFlow. During training, you have the choice to train on CPU or on GPU if you have one.
- This project was built with Python 3.8.13.
- All the required dependencies are listed in the requirements.txt file.
- Ensure Python 3.8.13 is installed on your computer.
- Download or clone this repository at the location of your choice.
- Create a virtual environment with Python 3.8.13:
  - virtualenv:

    ```
    virtualenv -p python3.8.13 [env name]
    ```

  - conda:

    ```
    conda create --name [env name] python=3.8.13
    ```

- Activate your virtual environment and install the required dependencies:
  - virtualenv:

    ```
    source [env name]/bin/activate
    pip3 install -r requirements.txt
    ```

  - conda:

    ```
    conda activate [env name]
    conda install --file requirements.txt
    ```
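Because the pinned dependencies target Python 3.8.13, installing them under a different interpreter can fail in confusing ways. A minimal sanity-check sketch (the helper name is illustrative, not part of the repository):

```python
import sys

REQUIRED = (3, 8)  # the project was built with Python 3.8.13


def interpreter_matches(required=REQUIRED):
    """Return True when the active interpreter has the required major.minor version."""
    return sys.version_info[:2] == required


if __name__ == "__main__":
    if not interpreter_matches():
        print(
            f"Warning: running Python {sys.version.split()[0]}, "
            f"but this project targets {REQUIRED[0]}.{REQUIRED[1]}.x"
        )
```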
The general workflow is presented in the diagram below:
This processing step was developed specifically for the Biarritz site. These are the steps to follow:

- Define the settings for data preparation by modifying `./configs/Setting_data.py`. All the settings are explained in the Python file.
- Launch the following command line:

  ```
  python3 generate_dataset.py [metadata.csv] [Output_dir]
  ```

  where `[metadata.csv]` is the file with all the metadata about the images. If it is not present in the `data_CNN/Data_processed` directory, it will be created and stored there. `[Output_dir]` is the name of the directory where you want all the prepared data for the CNN training. The directory structure is specifically created to work with Keras data loading procedures.
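The README does not show the exact layout `generate_dataset.py` produces. As an illustration only, Keras directory-based loaders expect one sub-directory per data split; the split names and sub-folder names below are assumptions, not the script's documented output:

```python
import tempfile
from pathlib import Path

# Hypothetical layout compatible with Keras directory-based data loading.
# The split names ("train"/"validation"/"test") and the "images"/"targets"
# sub-folders are illustrative assumptions, not the repository's actual output.
SPLITS = ("train", "validation", "test")
KINDS = ("images", "targets")


def make_layout(output_dir):
    """Create the nested split/kind directories under output_dir and return them."""
    created = []
    for split in SPLITS:
        for kind in KINDS:
            d = Path(output_dir) / split / kind
            d.mkdir(parents=True, exist_ok=True)
            created.append(d)
    return created


if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as tmp:
        for d in make_layout(tmp):
            print(d.relative_to(tmp))
```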
There are two ways to train the models:
- With a unique configuration file:
  - Define the hyperparameters of the networks in the JSON file located in `./config`.
  - Launch the following command line:

    ```
    python3 train.py [gpu] [Name of config file]
    ```

    where `[gpu]` is an option to perform the training with (1) or without (0) a GPU.
  - U-Net and Pix2Pix models will be trained. At the end of the training, all the outputs are saved in the `trained_models/` directory: the hyperparameters and performances on the test set are saved in `Results_test.csv`, and the models are saved directly in the same directory.
- With several configuration files:
  - Create different configuration files in `./config` with the hyperparameters you want to test (same JSON format).
  - Run the following bash command to train all the networks by iterating over the different config files in `./config`:

    ```
    bash train_configs.sh [env name]
    ```

  - Inspect the results in `trained_models/`.
  - The performances on the test set are saved in `Results_test.csv`.
Detailed analysis of the predictions can be done with the following Jupyter notebook: `prediction_analysis.ipynb`.
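The multi-configuration run can be sketched in Python: iterate over every JSON file in the config directory and hand each set of hyperparameters to a training call. The hyperparameter keys below are illustrative assumptions (the real keys live in the repository's JSON files, and the real entry point is `train.py`):

```python
import json
import tempfile
from pathlib import Path


def load_configs(config_dir):
    """Yield (file name, parsed hyperparameters) for every JSON config in config_dir."""
    for path in sorted(Path(config_dir).glob("*.json")):
        with open(path) as f:
            yield path.name, json.load(f)


if __name__ == "__main__":
    # Write two hypothetical configs; the key names are assumptions for illustration.
    with tempfile.TemporaryDirectory() as tmp:
        for i, lr in enumerate((1e-3, 1e-4)):
            (Path(tmp) / f"config_{i}.json").write_text(
                json.dumps({"learning_rate": lr, "batch_size": 16, "epochs": 50})
            )
        # Each iteration would correspond to one `python3 train.py` invocation.
        for name, params in load_configs(tmp):
            print(name, params["learning_rate"])
```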
Aurelien Callens
Inspiration, code snippets:
- U-Net architecture, metrics: Collins et al.
- Training a GAN in Keras with `.fit_generator()`: Daniel Möller (Stack Overflow)