# Facies modeling with GANs

Underground facies (= kind of sediment) modeling with GANs.

**Disclaimer**: This repository is a work in progress. This code aims to be the
official implementation of a not-yet-published research paper. It provides
complete code to train and evaluate GANs for facies modeling, as well as
unit tests and a small ready-to-use dataset (see below).
![PythonVersion](https://img.shields.io/badge/python-3.7%20%7E%203.10-informational)
![PytorchVersion](https://img.shields.io/badge/Pytorch-1.8%20%7E%201.12-blue)
[![License](https://img.shields.io/badge/license-MIT-white)](https://stringfixer.com/fr/MIT_license)
![WandB](https://img.shields.io/badge/WandB-supported-brightgreen)
![ClearML](https://img.shields.io/badge/ClearML-supported-brightgreen)
[![Flake8](https://github.com/valentingol/gan-facies-modeling/actions/workflows/flake.yaml/badge.svg)](https://github.com/valentingol/gan-facies-modeling/actions/workflows/flake.yaml)
[![Pydocstyle](https://github.com/valentingol/gan-facies-modeling/actions/workflows/pydocstyle.yaml/badge.svg)](https://github.com/valentingol/gan-facies-modeling/actions/workflows/pydocstyle.yaml)
[![MyPy](https://github.com/valentingol/gan-facies-modeling/actions/workflows/mypy.yaml/badge.svg)](https://github.com/valentingol/gan-facies-modeling/actions/workflows/mypy.yaml)
[![Isort](https://github.com/valentingol/gan-facies-modeling/actions/workflows/isort.yaml/badge.svg)](https://github.com/valentingol/gan-facies-modeling/actions/workflows/isort.yaml)
[![PyLint](https://img.shields.io/endpoint?url=https://gist.githubusercontent.com/valentingol/106c646ac67294657bccf02bbe22208f/raw/gan_facies_modeling_pylint.json)](https://github.com/valentingol/gan-facies-modeling/actions/workflows/pylint.yaml)
---
Date: 2022-07-20

Author: [github@Valentingol](https://github.com/valentingol)

[![GitHub User followers](https://img.shields.io/github/followers/valentingol?label=Owner%20followers&style=social)](https://github.com/valentingol)
[![GitHub User stars](https://img.shields.io/github/stars/valentingol?label=Owner%20Stars&style=social)](https://github.com/valentingol)

Work done during my five-month internship at [IFPEN](https://www.ifpenergiesnouvelles.com/),
supervised by [UVSQ](https://www.uvsq.fr/english)
and financed by [DATAIA Paris Saclay](https://dataia.eu/en/dataia-paris-saclay-institute).

Copyright © 2022 Goldite Valentin

MIT License ([see here](LICENSE.md))

---
**2D Models:**

- `sagan` - Unconditional SAGAN (based on
  [Self-Attention Generative Adversarial Networks](https://arxiv.org/abs/1805.08318)
  and [Modeling of subsurface sedimentary facies using SAGANs](https://www.sciencedirect.com/science/article/abs/pii/S0920410522003540))

- `cond_sagan` - Conditional SAGAN (based on the papers above
  for the SAGAN part and on
  [GANSim: Conditional Facies Simulation Using an Improved Progressive Growing of GANs](https://ideas.repec.org/p/osf/eartha/fm24b.html)
  for the conditional part) that also reconstructs the input pixel maps

Note that you can disable self-attention in the configurations (yielding a DCGAN architecture).

**3D Models:**

soon 🚧

## Quick start

### Installation

Install the module and dependencies in a virtual environment with Python 3.7-3.10:

```bash
pip install -e .
pip install -r requirements.txt
# for developers only:
pip install -r requirements-dev.txt
```

### Train on the default dataset

A small dataset is available by default in this repository. It contains 2000
synthesized images representing some channels and 3 kinds of facies and was
generated in the [GANSim project](https://github.com/SuihongSong/GeoModeling_GANSim-2D_Condition_to_Well_Facies_and_Global_Features)
(under [MIT license](./assets/third_party_licenses/GANSim%20MIT%20LICENSE)).
More synthesized data are available
[here](https://zenodo.org/record/3993791#.X1FQuMhKhaR). **If you use this dataset
in your work, please cite the original authors.**

You can simply run a training on the default dataset with the unconditional SAGAN
model using the following command from the root of the repository:

```bash
python gan_facies/apps/train.py
```

You can see the progress of the training in the terminal and the resulting
images and trained networks in the `res` folder.

## Use your own dataset

Of course, you can use your own dataset. Simply drop it in the `datasets` folder.
The dataset should be a Numpy file (`.npy`) containing a 3D ndarray of format
(z/depth/n_samples, y, x) and type `uint8`, with a different number for each
facies, starting from 0. The number of facies is then `dataset.max() + 1`.
Now you can run the training adding the `--dataset_path=<mypath>` argument.
You can also change the dataset path via configuration files; the next section
explains how to do that.
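As an illustration, the expected format described above can be sketched with NumPy as follows (the file name, sizes, and number of facies here are illustrative choices, not part of the repository):

```python
import os

import numpy as np

# Illustrative dataset: 2000 samples of 64 x 64 facies maps with
# 3 facies labeled 0, 1 and 2, stored as uint8 as required.
rng = np.random.default_rng(seed=0)
dataset = rng.integers(low=0, high=3, size=(2000, 64, 64), dtype=np.uint8)

n_facies = dataset.max() + 1  # facies are numbered from 0, so 3 here

os.makedirs("datasets", exist_ok=True)
np.save("datasets/my_dataset.npy", dataset)
```

You could then point the training to this file with `--dataset_path=datasets/my_dataset.npy`.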

## Configurations

It is always interesting to customize the training with your own configurations.
This repository contains many configurations, organized in multiple sub-configurations.
The configurations are managed with the smart configuration
manager [YAECS](https://github.com/valentingol/yaecs).

The default sub-configurations (for models, training, ...) are organized in
different files in `configs/default`. You can launch your own experiment by
writing a new `.yaml` file that will be merged with the default configuration.
Some examples are available in `configs/exp`. For example, the following file
will override the default run name and set the discriminator
learning rate to 0.001:

```yaml
# >> file 'configs/exp/my_config.yaml'
run_name: my_experiment
training.d_lr: 0.001
```

Then you can run the experiment by adding the configuration on the command line:

```bash
python gan_facies/apps/train.py --config gan_facies/configs/exp/my_config.yaml
```

*Note: The space between `--config` and the configuration file is important.*

Moreover, you can also override parameters by adding them on the **command line**.
For example, this will override the default configuration with your experiment
configuration, then set the generator learning rate to 0.001 and the generator
random input dimension to 64:

```bash
python gan_facies/apps/train.py --config gan_facies/configs/exp/my_config.yaml \
    --training.g_lr=0.001 --model.z_dim=64
```

*Note: The `=` between `--param` and the value is important.*

To use the conditional model you can check the `configs/exp/conditional.yaml` file
and adapt it to your needs. Another way is to use the cascade merging of
configurations provided by YAECS. In fact, if you pass a **list** of configuration
files to `--config`, they will be merged together (from the beginning of the list
to the end). Example:

```bash
python gan_facies/apps/train.py --config [gan_facies/configs/exp/models/cond_sagan.yaml,gan_facies/configs/exp/my_config.yaml]
```

First `configs/exp/models/cond_sagan.yaml` will be merged (changing the model configuration),
then `configs/exp/my_config.yaml` (overwriting the model configuration if needed).
You can create your own specific configurations (for data, models, metrics, ...)
and merge as many of them as you want.

Finally, the configurations will be automatically saved (by default in `res/configs`)
to ensure that you can always recover the exact configuration used for the runs.
The "hierarchy of merging" is also saved to quickly understand how the configuration
more interesting parameters and many utilities to explore the parameters space
(collaboratively or not), etc.

This repository allows you to use WandB and ClearML very simply. You can check the default
configuration covering WandB and ClearML in `configs/default/experiment_tracking.yaml`.
To use WandB or ClearML you first need to install them, create an account if
you don't have one, and set the configuration `wandb.use_wandb: True` or
`clearml.use_clearml: True`, in addition to the parameters to initialize
the WandB run or ClearML task.
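For instance, a minimal experiment file enabling WandB could look like the sketch below; `wandb.use_wandb` comes from the paragraph above, while the run-initialization keys (`project`, `entity`) are hypothetical placeholders to adapt to your own setup:

```yaml
# >> file 'configs/exp/my_tracking_config.yaml' (hypothetical example)
wandb.use_wandb: True
# Hypothetical parameters used to initialize the WandB run:
wandb.project: facies-gan
wandb.entity: my-team
```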

Plus, you can explore the parameters space using `wandb.sweep`. To do so, you
simply need to create a sweep config such as in `configs/sweep/ex_sweep.yaml`
Note:

- It is currently not possible to use both ClearML and WandB at the same time
- It is currently not possible to use hyperparameter search with ClearML
  (only with WandB sweeps). We welcome any contribution to add this feature
  (see [`CONTRIBUTE.md`](CONTRIBUTE.md))

## Examples of generated images

### GANSim dataset

| <img src="./assets/images/gansim_real.png" width="512"> |
| :--: |
| **Real Images** (64 $\times$ 64) |

| <img src="./assets/images/gansim_generated.png" width="512"> |
| :--: |
| **Generated Images** (128 $\times$ 128) |

### Stanford-VI dataset (first part)

| <img src="./assets/images/stanfordp1_real.png" width="512"> |
| :--: |
| **Real Images** |

| <img src="./assets/images/stanfordp1_generated.png" width="512"> |
| :--: |
| **Generated Images** |

## TODO list

- [x] Add test for generator in `apps/train.py`
- [x] Add generated results on GANSim dataset and
  [Stanford VI dataset](https://github.com/SCRFpublic/Stanford-VI-E/tree/master/Facies)
- [x] Add conditional SAGAN
- [ ] Add images generated by conditional model and metrics in README
- [ ] Add 3D models

## How to contribute

We welcome any contribution to improve this repository. Please have a look at
[CONTRIBUTE.md](./CONTRIBUTE.md). Thank you very much! 🙏