Commit 3a610d8

committed
Adjusting README and files according to JOSS reviewers
1 parent 754e98c commit 3a610d8

File tree

7 files changed

+66
-29
lines changed


M_rate16.npy

64.8 KB
Binary file not shown.

README.md

Lines changed: 56 additions & 12 deletions
Original file line numberDiff line numberDiff line change
@@ -2,41 +2,85 @@
22

33
Welcome to the MACE repository!
44

5-
***MACE, a Machine-learning Approach to Chemistry Emulation***, by [Maes et al. (2024)](https://ui.adsabs.harvard.edu/abs/2024arXiv240503274M/abstract), is a surrogate model for chemical kinetics. It is developed in the contexts of circumstellar envelopes (CSEs) of asymptotic giant branch (AGB) stars, i.e. evolved low-mass stars.
5+
***MACE, a Machine-learning Approach to Chemistry Emulation***, by [Maes et al. (2024)](https://ui.adsabs.harvard.edu/abs/2024arXiv240503274M/abstract), is a surrogate model for chemical kinetics. It is developed in the context of circumstellar envelopes (CSEs) of asymptotic giant branch (AGB) stars, i.e. evolved low-mass stars.
66

7-
MACE is implemented in Python and uses [PyTorch](https://pytorch.org/), together with [torchode](https://github.com/martenlienen/torchode) [(Lienen & Gunnemann, 2022)](https://openreview.net/pdf?id=uiKVKTiUYB0), to be trained.
7+
During development, the chemical models of [Maes et al. (2023)](https://ui.adsabs.harvard.edu/abs/2023MNRAS.522.4654M/abstract) were used. That paper also gives more details about the astrochemical environment in question.
8+
9+
MACE is implemented in Python and is trained using [PyTorch](https://pytorch.org/), together with [torchode](https://github.com/martenlienen/torchode) [(Lienen & Gunnemann, 2022)](https://openreview.net/pdf?id=uiKVKTiUYB0).
10+
11+
---
12+
## Table of contents
13+
- [Installation](#inst)
14+
- [What is MACE?](#what)
15+
- [How to use?](#use)
16+
- [Example case](#exmp)
17+
- [Contact](#cont)
18+
- [Acknowledgements](#ackn)
819

920
---
10-
## Notes on installation
11-
- MACE is not available on ```pypi```, the package named ```mace``` is not this one.
21+
## Notes on installation <a name="inst"></a>
22+
- MACE is currently not available as a package on ```pypi```. There is a package named ```mace```, but it is not this one.
1223
- To use MACE, please clone the repo and install the required packages listed in ```requirements.txt```:
1324
```
1425
git clone https://github.com/silkemaes/MACE.git
1526
```
1627

17-
1828
---
19-
## What?
29+
## What is MACE? <a name="what"></a>
30+
31+
MACE offers a surrogate model that emulates the evolution of chemical abundances over time in a dynamical physical environment. As the name states, it makes use of machine-learning techniques. More specifically, combining an *autoencoder* (blue) and a *trainable ordinary differential equation (ODE)* (red) makes it possible to accurately emulate a chemical kinetics model.
32+
33+
Hence, MACE is a framework, an architecture, that can be trained on a specific chemical dataset; before use, it should be made compatible with that dataset, see _[How to use?](#use)_.
2034

2135
The architecture of MACE is schematically given as
2236
![MACE architecture](MACE.png)
2337

24-
MACE offers a surrogate model that emulates the evolution of chemical abundances over time in a dynamical physical environment. As the name states, it makes use of machine learning techniques. More specifically, combining an *autoencoder* (blue) and a *trainable ordinary differential equation (ODE)* (red) allows to accurately emulate a chemical kinetics model.
2539

2640
Expressed as a formula, MACE reads
27-
$${\hat{\boldsymbol{n}}}(t) = \mathcal{D}\Big( G \big( \mathcal{E} ({\boldsymbol{n}}, {\boldsymbol{p}}),t \big) \Big).$$
28-
Here, ${\hat{\boldsymbol{n}}}(t)$ are the predicted chemical abundances at a time $t$ later dan the initial state ${\boldsymbol{n}}$ . $\mathcal{E}$ and $\mathcal{D}$ represent the autoecoder, with the encoder and decoder, respectively. The autoencoder maps the chemical space ${\boldsymbol{n}}$ together with the physical space ${\boldsymbol{p}}$ to a lower dimensional representation $\boldsymbol{z}$, called the latent space. The function $G$ describes the evolution in latent space such that $\boldsymbol{z}(\Delta t) = G(\boldsymbol{z}, \Delta t)=\int_0^{\Delta t} g(\boldsymbol{z}){\rm d}t$.
41+
42+
$$
43+
{\hat{\boldsymbol{n}}}(t) = \mathcal{D}\Big( G \big( \mathcal{E} ({\boldsymbol{n_0}}, {\boldsymbol{p}}),t \big) \Big).
44+
$$
45+
46+
Here, ${\hat{\boldsymbol{n}}}(t)$ are the predicted chemical abundances at a time $t$ later than the initial state ${\boldsymbol{n_0}}$. $\mathcal{E}$ and $\mathcal{D}$ represent the autoencoder, i.e. the encoder and decoder, respectively. The autoencoder maps the chemical space ${\boldsymbol{n_0}}$ together with the physical space ${\boldsymbol{p}}$ to a lower-dimensional representation $\boldsymbol{z}$, called the latent space. The function $G$ describes the evolution in latent space such that $\boldsymbol{z}(\Delta t) = G(\boldsymbol{z}, \Delta t)=\int_0^{\Delta t} g(\boldsymbol{z})\,{\rm d}t$.
2947
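As a minimal sketch, the composition $\mathcal{D}\big(G(\mathcal{E}(\boldsymbol{n_0}, \boldsymbol{p}), t)\big)$ can be written as a toy PyTorch module. All dimensions and layer choices below are illustrative, and the latent ODE is integrated with a naive fixed-step Euler loop rather than torchode; this is not MACE's actual API:

```python
import torch
import torch.nn as nn

class ToyMACE(nn.Module):
    """Illustrative sketch of the MACE composition D(G(E(n0, p), t)).

    Dimensions and layers are invented for the example; the real model
    uses torchode to integrate the latent ODE instead of the fixed-step
    Euler loop below.
    """
    def __init__(self, n_species=10, n_phys=4, n_latent=3):
        super().__init__()
        self.encoder = nn.Linear(n_species + n_phys, n_latent)  # E(n0, p) -> z
        self.g = nn.Linear(n_latent, n_latent)                  # g(z), latent dynamics
        self.decoder = nn.Linear(n_latent, n_species)           # D(z) -> n_hat

    def forward(self, n0, p, dt, steps=100):
        z = self.encoder(torch.cat([n0, p], dim=-1))  # encode chemistry + physics
        h = dt / steps
        for _ in range(steps):                        # z(dt) = integral of g(z) dt (Euler)
            z = z + h * self.g(z)
        return self.decoder(z)                        # decode back to abundances

model = ToyMACE()
n_hat = model(torch.zeros(1, 10), torch.zeros(1, 4), dt=1.0)
print(n_hat.shape)  # torch.Size([1, 10])
```

In the real architecture the encoder, latent dynamics, and decoder are of course deeper networks, and the integration is handled by a proper adaptive solver.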

3048
For more details, check out our paper: [Maes et al. (2024)](https://ui.adsabs.harvard.edu/abs/2024arXiv240503274M/abstract).
3149

3250
---
33-
## How to run?
51+
## How to use? <a name="use"></a>
52+
53+
The script ```routine.py``` runs the full flow of training & storing a MACE architecture and, once training is finished, immediately applies the trained model to the specified test dataset. As such, it returns an average error of the MACE model with respect to the classical model. More info on the training routine can be found in the [paper](https://ui.adsabs.harvard.edu/abs/2024arXiv240503274M/abstract).
54+
55+
An annotated notebook of the routine can be found in the [documentation](https://mace-code.readthedocs.io/en/latest/example/run.html).
56+
57+
The script ```routine.py``` takes an input file with the needed (hyper)parameter setup. An example of such an input file can be found in ```input/```:
58+
```
59+
python routine.py example
60+
```
61+
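For illustration only, such an input file follows a simple ```key = value``` layout. Apart from ```elm``` (which does appear in ```input/example.in```), the keys below are invented placeholders, not MACE's real parameter names:

```
## hypothetical (hyper)parameter file -- see input/example.in for the real keys
lr     = 1.e-4     ## learning rate (placeholder)
epochs = 100       ## number of training epochs (placeholder)
elm    = 0
```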
62+
***Disclaimer:***
63+
64+
In order to train MACE with a certain chemical dataset, the ```Dataset``` class
65+
should be made compatible with that data. Currently, the script ```src/mace/CSE_0D/dataset.py``` works only for the specific dataset used here, i.e. models from [Maes et al. (2023)](https://ui.adsabs.harvard.edu/abs/2023MNRAS.522.4654M/abstract), using the [Rate22-CSE code](https://github.com/MarieVdS/rate22_cse_code).
66+
67+
68+
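Making a new dataset compatible essentially means exposing the same interface as ```src/mace/CSE_0D/dataset.py```. As a generic, hedged sketch, such a class tends to follow the standard PyTorch ```Dataset``` protocol (```__len__``` and ```__getitem__```); the field names and normalisation below are invented for the example:

```python
import numpy as np
from torch.utils.data import Dataset

class MyChemistryDataset(Dataset):
    """Illustrative stand-in for src/mace/CSE_0D/dataset.py.

    Attribute names and the log-normalisation are invented for this
    sketch; a real replacement must return exactly the shapes that
    MACE's training loop expects.
    """
    def __init__(self, abundances, physics, times):
        # abundances: (n_samples, n_species), physics: (n_samples, n_phys)
        self.n = np.log10(np.clip(abundances, 1e-30, None))  # log-scale abundances
        self.p = physics
        self.t = times

    def __len__(self):
        return len(self.t)

    def __getitem__(self, idx):
        # one training sample: chemical state, physical state, time
        return self.n[idx], self.p[idx], self.t[idx]
```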
---
69+
## Example case <a name="exmp"></a>
70+
71+
This repository contains a trained MACE model as a test case, see ```model/20240604_160152```.
3472

35-
Once the Dataset class is set up properly (see src/mace/CSE_0D/dataset.py), a MACE model can be trained. This can be done using the script 'run.py', which takes an input file with the needed (hyper)parameter setup. An example of such an input file can be found in input/.
73+
The code for loading a trained MACE model can be found in the script ```src/mace/load.py```, and code for testing it in ```src/mace/test.py```. An annotated notebook can be found in the [documentation](https://mace-code.readthedocs.io/en/latest/example/load%26test.html).
3674

37-
The script run.py trains the model, as explained by [Maes et al. (2024)](https://ui.adsabs.harvard.edu/abs/2024arXiv240503274M/abstract), and is immediately applied to the specified test dataset once training is finished. As such, it returns an averaged error on the MACE model compared to the classical model.
75+
---
76+
## Contact <a name="cont"></a>
3877

78+
If any comments or issues come up, please contact me via [email](mailto:[email protected]), or open a GitHub issue.
79+
80+
---
81+
## Acknowledgements <a name="ackn"></a>
3982

83+
The MACE architecture is free to use. Please cite our paper, [Maes et al. (2024)](https://ui.adsabs.harvard.edu/abs/2024arXiv240503274M/abstract), when using it.
4084

4185

4286

File renamed without changes.

input/example.in

Lines changed: 0 additions & 8 deletions
Original file line numberDiff line numberDiff line change
@@ -25,11 +25,3 @@ elm = 0
2525

2626

2727

28-
29-
Name = 20240604_133421
30-
31-
Name = 20240604_133421
32-
33-
Name = 20240604_133538
34-
35-
Name = 20240604_133932

rate16.specs

Whitespace-only changes.

run.py renamed to routine.py

Lines changed: 9 additions & 8 deletions
Original file line numberDiff line numberDiff line change
@@ -1,10 +1,11 @@
1-
import matplotlib.pyplot as plt
2-
import numpy as np
1+
import matplotlib.pyplot as plt
2+
import numpy as np
33
import sys
44
import torch
5-
from time import time
6-
import datetime as dt
7-
from tqdm import tqdm
5+
from time import time
6+
import datetime as dt
7+
from tqdm import tqdm
8+
import os
89

910
import src.mace.CSE_0D.dataset as ds
1011
import src.mace.train as train
@@ -15,14 +16,14 @@
1516
import src.mace.utils as utils
1617
from src.mace.input import Input
1718

18-
19+
source_dir = os.path.dirname(os.path.abspath(__file__))
1920

2021
specs_dict, idx_specs = utils.get_specs()
2122

2223
start = time()
2324
now = dt.datetime.now()
2425
name = str(now.strftime("%Y%m%d")+'_'+now.strftime("%H%M%S"))
25-
path = '/STER/silkem/MACE/models/CSE_0D/'+name
26+
path = source_dir+'/models/CSE_0D/'+name
2627

2728

2829
## ================================================== INPUT ========
@@ -31,7 +32,7 @@
3132
## READ INPUT FILE
3233
arg = sys.argv[1]
3334

34-
infile = '/STER/silkem/MACE/input/'+arg+'.in'
35+
infile = source_dir+'/input/'+arg+'.in'
3536

3637
input = Input(infile, name)
3738

src/mace/loss.py

Lines changed: 1 addition & 1 deletion
Original file line numberDiff line numberDiff line change
@@ -46,7 +46,7 @@ def __init__(self, norm, fract, losstype):
4646

4747
parentpath = str(Path(__file__).parent)[:-15]
4848

49-
self.M = np.load(parentpath+'data/M_rate16.npy')
49+
self.M = np.load(parentpath+'M_rate16.npy')
5050

5151
## initialise
5252
self.set_losstype(losstype)
