The optimizer selection is based on the following characteristics of your search space:

- If it has fidelity: `hyperband`
- If it has both fidelity and a prior: `priorband`
- If it has a prior: `pibo`
- If it has neither: `bayesian_optimization`

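The precedence among these rules can be sketched as a small decision function. This is an illustration of the selection order only, not the actual NePS implementation; the `has_fidelity`/`has_prior` flags are hypothetical names:

```python
def choose_optimizer(has_fidelity: bool, has_prior: bool) -> str:
    """Sketch of the default optimizer-selection order described above."""
    if has_fidelity and has_prior:
        return "priorband"  # both fidelity and a prior
    if has_fidelity:
        return "hyperband"  # fidelity only
    if has_prior:
        return "pibo"  # prior only
    return "bayesian_optimization"  # neither
```

Note that the combined fidelity-and-prior case must be checked first, since either flag alone would otherwise match an earlier rule.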
For example, running the following without specifying a searcher will choose an optimizer depending on the `pipeline_space` passed:
```python
neps.run(
    run_pipeline=run_function,
    pipeline_space=pipeline_space,
    root_directory="results/",
    max_evaluations_total=25,
    # no searcher specified
)
```

### 2. Choosing one of NePS Optimizers

We have also prepared some optimizers with specific hyperparameters that we believe can generalize well to most AutoML tasks and use cases. For more details on the available default optimizers and the algorithms that can be called, please refer to the next section on [SearcherConfigs](#Searcher-Configurations).

```python
neps.run(
    run_pipeline=run_function,
    pipeline_space=pipeline_space,
    root_directory="results/",
    max_evaluations_total=25,
    # searcher specified, along with an argument
    searcher="bayesian_optimization",
    initial_design_size=5,
)
```

For more optimizers, please refer [here](#List-Available-Searchers).

### 3. Custom Optimizer Configuration via YAML
For users who want more control over the optimizer's hyperparameters, you can create your own YAML configuration file. In this file, you can specify the hyperparameters for your preferred optimizer. To use this custom configuration, provide the path to your YAML file using the `searcher_path` parameter when running the optimizer. The library will then load your custom settings and use them for optimization.

Here's the format of a custom YAML (`custom_bo.yaml`) configuration using `Bayesian Optimization` as an example:

```yaml
searcher_init:
  algorithm: bayesian_optimization
searcher_kwargs:  # Specific arguments depending on the searcher
  initial_design_size: 7
  surrogate_model: gp
  acquisition: EI
  log_prior_weighted: false
  acquisition_sampler: random
  random_interleave_prob: 0.1
  disable_priors: false
  prior_confidence: high
  sample_default_first: false
```

```python
neps.run(
    run_pipeline=run_function,
    pipeline_space=pipeline_space,
    root_directory="results/",
    max_evaluations_total=25,
    # searcher specified, along with an argument
    searcher_path="custom/path/to/directory",
    # `custom_bo.yaml` should be in `searcher_path`
    searcher="custom_bo",
)
```

### 4. Hyperparameter Overrides
If you want to make on-the-fly adjustments to the optimizer's hyperparameters without modifying the YAML configuration file, you can do so by passing keyword arguments (kwargs) to the `neps.run` function itself. This enables you to fine-tune specific hyperparameters without the need for YAML file updates. Any hyperparameter values provided as kwargs will take precedence over those specified in the YAML configuration.

```python
neps.run(
    run_pipeline=run_function,
    pipeline_space=pipeline_space,
    root_directory="results/",
    max_evaluations_total=25,
    # searcher specified, along with an argument
    searcher_path="custom/path/to/directory",
    # `custom_bo.yaml` should be in `searcher_path`
    searcher="custom_bo",
    initial_design_size=5,  # overrides value in custom_bo.yaml
    random_interleave_prob=0.25,  # overrides value in custom_bo.yaml
)
```

## Note for Contributors
When designing a new optimizer, it's essential to create a YAML configuration file in the `default_searcher` folder under `neps.src.optimizers`. This YAML file should contain the default configuration settings that you believe should be used when the user chooses the searcher.
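
As a sketch of what such a default configuration file could look like (the algorithm name and values here are hypothetical, following the `searcher_init`/`searcher_kwargs` format shown earlier):

```yaml
searcher_init:
  algorithm: my_new_searcher  # hypothetical algorithm name
searcher_kwargs:  # defaults you believe suit this searcher
  initial_design_size: 10
```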

If you want to identify which NePS searchers are using a specific searching algorithm (e.g., Bayesian Optimization, Hyperband, PriorBand...), you can use the `get_searcher_from_algorithm` function. It returns a list of searchers utilizing the specified algorithm:

```python
algorithm = "bayesian_optimization"  # Replace with the desired algorithm
```