Commit d3d06b5

Iterating yaml api PR
1 parent 9f46b70 commit d3d06b5

15 files changed (+109, -82 lines)

src/neps/api.py

Lines changed: 6 additions & 6 deletions
@@ -248,10 +248,10 @@ def run(
         # NePS decides the searcher according to the pipeline space.
         if pipeline_space.has_prior:
             searcher = "priorband" if pipeline_space.has_fidelity else "pibo"
-        elif pipeline_space.has_fidelity:
-            searcher = "hyperband"
         else:
-            searcher = "bayesian_optimization"
+            searcher = (
+                "hyperband" if pipeline_space.has_fidelity else "bayesian_optimization"
+            )
     else:
         # Users choose one of NePS searchers.
         user_defined_searcher = True
@@ -318,14 +318,14 @@ def run(
         searcher_instance,
         searcher_info,
         root_directory,
-        development_stage_id=development_stage_id,
-        task_id=task_id,
         max_evaluations_total=max_evaluations_total,
         max_evaluations_per_run=max_evaluations_per_run,
-        overwrite_optimization_dir=overwrite_working_directory,
         continue_until_max_evaluation_completed=continue_until_max_evaluation_completed,
+        development_stage_id=development_stage_id,
+        task_id=task_id,
         logger=logger,
         post_evaluation_hook=_post_evaluation_hook_function(
             loss_value_on_error, ignore_errors
         ),
+        overwrite_optimization_dir=overwrite_working_directory,
     )
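The first hunk above collapses the fidelity-only and no-fidelity branches into a single conditional expression. As a sanity check, the resulting selection logic can be sketched outside NePS with a stand-in for the pipeline space; `SpaceTraits` and `default_searcher` are hypothetical names for illustration, not NePS APIs.

```python
from dataclasses import dataclass


@dataclass
class SpaceTraits:
    """Stand-in for the two pipeline-space flags the selection depends on."""
    has_prior: bool
    has_fidelity: bool


def default_searcher(space: SpaceTraits) -> str:
    # Priors take precedence; fidelity then picks between the two variants.
    if space.has_prior:
        return "priorband" if space.has_fidelity else "pibo"
    return "hyperband" if space.has_fidelity else "bayesian_optimization"
```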

src/neps/optimizers/README.md

Lines changed: 69 additions & 16 deletions
@@ -8,41 +8,94 @@ If you prefer not to specify a particular optimizer for your AutoML task, you ca
 
 The optimizer selection is based on the following characteristics of your search space:
 
-- If it has fidelity: hyperband
-- If it has a prior: pibo
-- If it has both fidelity and a prior: priorband
-- If it has neither: bayesian_optimization
+- If it has fidelity: `hyperband`
+- If it has both fidelity and a prior: `priorband`
+- If it has a prior: `pibo`
+- If it has neither: `bayesian_optimization`
+
+For example, running the following, without specifying a searcher, will choose an optimizer depending on the `pipeline_space` passed.
+```python
+neps.run(
+    run_pipeline=run_function,
+    pipeline_space=pipeline_space,
+    root_directory="results/",
+    max_evaluations_total=25,
+    # no searcher specified
+)
+```
 
 ### 2. Choosing one of NePS Optimizers
 
-We have also prepared some optimizers with specific hyperparameters that we believe can generalize well to most AutoML tasks and use cases. For more details on the available default optimizers and the algorithms that can be called, please refer to the next section on SearcherConfigs.
+We have also prepared some optimizers with specific hyperparameters that we believe can generalize well to most AutoML tasks and use cases. For more details on the available default optimizers and the algorithms that can be called, please refer to the next section on [SearcherConfigs](#Searcher-Configurations).
+
+```python
+neps.run(
+    run_pipeline=run_function,
+    pipeline_space=pipeline_space,
+    root_directory="results/",
+    max_evaluations_total=25,
+    # searcher specified, along with an argument
+    searcher="bayesian_optimization",
+    initial_design_size=5,
+)
+```
+
+For more optimizers, please refer [here](#List-Available-Searchers).
 
 ### 3. Custom Optimizer Configuration via YAML
 
 For users who want more control over the optimizer's hyperparameters, you can create your own YAML configuration file. In this file, you can specify the hyperparameters for your preferred optimizer. To use this custom configuration, provide the path to your YAML file using the `searcher_path` parameter when running the optimizer. The library will then load your custom settings and use them for optimization.
 
-Here's the format of the YAML configuration using `Bayesian Optimization` as an example:
+Here's the format of a custom YAML (`custom_bo.yaml`) configuration using `Bayesian Optimization` as an example:
 
 ```yaml
 searcher_init:
   algorithm: bayesian_optimization
-searcher_kwargs: # Specific arguments depending on the searcher
-  initial_design_size: 5
-  surrogate_model: gp_hierarchy # or {"gp_hierarchy", "deep_gp"}
-  acquisition: EI # or {"LogEI", "AEI", "MFEI"}
+searcher_kwargs:  # Specific arguments depending on the searcher
+  initial_design_size: 7
+  surrogate_model: gp
+  acquisition: EI
   log_prior_weighted: false
-  acquisition_sampler: random # or {"mutation", "evolution", "freeze-thaw"}
-  random_interleave_prob: 0.0
+  acquisition_sampler: random
+  random_interleave_prob: 0.1
   disable_priors: false
   prior_confidence: high
-  sample_default_first: true
+  sample_default_first: false
+```
+
+```python
+neps.run(
+    run_pipeline=run_function,
+    pipeline_space=pipeline_space,
+    root_directory="results/",
+    max_evaluations_total=25,
+    # custom searcher specified; `custom_bo.yaml` should be in `searcher_path`
+    searcher_path="custom/path/to/directory",
+    searcher="custom_bo",
+)
 ```
 
 ### 4. Hyperparameter Overrides
 
 If you want to make on-the-fly adjustments to the optimizer's hyperparameters without modifying the YAML configuration file, you can do so by passing keyword arguments (kwargs) to the `neps.run` function itself. This enables you to fine-tune specific hyperparameters without the need for YAML file updates. Any hyperparameter values provided as kwargs will take precedence over those specified in the YAML configuration.
 
-### Note for Contributors
+```python
+neps.run(
+    run_pipeline=run_function,
+    pipeline_space=pipeline_space,
+    root_directory="results/",
+    max_evaluations_total=25,
+    searcher_path="custom/path/to/directory",
+    searcher="custom_bo",
+    initial_design_size=5,  # overrides the value in custom_bo.yaml
+    random_interleave_prob=0.25,  # overrides the value in custom_bo.yaml
+)
+```
+
+## Note for Contributors
 
 When designing a new optimizer, it's essential to create a YAML configuration file in the `default_searcher` folder under `neps.src.optimizers`. This YAML file should contain the default configuration settings that you believe should be used when the user chooses the searcher.
 
@@ -82,11 +135,11 @@ print("Available searching algorithms:", algorithms)
 
 ### Find Searchers Using a Specific Algorithm
 
-If you want to identify which NePS searchers are using a specific searching algorithm (e.g., Bayesian Optimization, Hyperband, PriorBand...), you can use the `get_searcher_from_alg` function. It returns a list of searchers utilizing the specified algorithm:
+If you want to identify which NePS searchers are using a specific searching algorithm (e.g., Bayesian Optimization, Hyperband, PriorBand...), you can use the `get_searcher_from_algorithm` function. It returns a list of searchers utilizing the specified algorithm:
 
 ```python
 algorithm = "bayesian_optimization" # Replace with the desired algorithm
-searchers = SearcherConfigs.get_searcher_from_alg(algorithm)
+searchers = SearcherConfigs.get_searcher_from_algorithm(algorithm)
 print(f"Searchers using {algorithm}:", searchers)
 ```
 
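The override precedence described in section 4 of the README diff above (run-time kwargs beat YAML values) amounts to a plain dict merge. A minimal sketch; `merge_searcher_kwargs` is an illustrative helper of ours, not a NePS API, and the values mirror the `custom_bo.yaml` example.

```python
def merge_searcher_kwargs(yaml_config: dict, run_kwargs: dict) -> dict:
    """Layer run-time kwargs on top of the YAML defaults."""
    merged = dict(yaml_config)  # start from the YAML values
    merged.update(run_kwargs)   # run-time values win on key collisions
    return merged


# Defaults as in the custom_bo.yaml example; overrides as in section 4.
yaml_defaults = {"initial_design_size": 7, "random_interleave_prob": 0.1}
overrides = {"initial_design_size": 5, "random_interleave_prob": 0.25}
merged = merge_searcher_kwargs(yaml_defaults, overrides)
```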

src/neps/optimizers/default_searchers/asha.yaml

Lines changed: 1 addition & 1 deletion
@@ -4,7 +4,7 @@ searcher_kwargs:
   # Arguments that can be modified by the user
   eta: 3
   early_stopping_rate: 0
-  initial_design_type: max_budget # or {"unique_configs"}
+  initial_design_type: max_budget
   use_priors: false
   random_interleave_prob: 0.0
   sample_default_first: false

src/neps/optimizers/default_searchers/asha_prior.yaml

Lines changed: 2 additions & 2 deletions
@@ -4,8 +4,8 @@ searcher_kwargs:
   # Arguments that can be modified by the user
   eta: 3
   early_stopping_rate: 0
-  initial_design_type: max_budget # or {"unique_configs"}
-  prior_confidence: medium # or {"low", "high"}
+  initial_design_type: max_budget
+  prior_confidence: medium  # or {"low", "high"}
   random_interleave_prob: 0.0
   sample_default_first: false
   sample_default_at_target: false

src/neps/optimizers/default_searchers/bayesian_optimization.yaml

Lines changed: 3 additions & 3 deletions
@@ -3,10 +3,10 @@ searcher_init:
 searcher_kwargs:
   # Arguments that can be modified by the user
   initial_design_size: 10
-  surrogate_model: gp # or {"gp_hierarchy", "deep_gp"}
-  acquisition: EI # or {"LogEI", "AEI", "MFEI"}
+  surrogate_model: gp # or {"gp_hierarchy"}
+  acquisition: EI # or {"LogEI", "AEI"}
   log_prior_weighted: false
-  acquisition_sampler: mutation # or {"random", "evolution", "freeze-thaw"}
+  acquisition_sampler: mutation # or {"random", "evolution"}
   random_interleave_prob: 0.0
   disable_priors: true
   sample_default_first: false
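The comments in this file now advertise smaller option sets (`deep_gp`, `MFEI`, and `freeze-thaw` are no longer listed). A small sketch of a user-side check against those sets: the allowed values are read off the diff above, while `invalid_choices` is a hypothetical helper, not part of NePS.

```python
# Option sets as listed in the updated YAML comments (an assumption that
# these are exactly the supported values after this commit).
ALLOWED = {
    "surrogate_model": {"gp", "gp_hierarchy"},
    "acquisition": {"EI", "LogEI", "AEI"},
    "acquisition_sampler": {"mutation", "random", "evolution"},
}


def invalid_choices(kwargs: dict) -> dict:
    """Return the subset of kwargs whose value falls outside its allowed set."""
    return {
        key: value
        for key, value in kwargs.items()
        if key in ALLOWED and value not in ALLOWED[key]
    }
```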

src/neps/optimizers/default_searchers/hyperband.yaml

Lines changed: 1 addition & 1 deletion
@@ -3,7 +3,7 @@ searcher_init:
 searcher_kwargs:
   # Arguments that can be modified by the user
   eta: 3
-  initial_design_type: max_budget # or {"unique_configs"}
+  initial_design_type: max_budget
   use_priors: false
   random_interleave_prob: 0.0
   sample_default_first: false
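For intuition on the `eta: 3` default above: in Hyperband-style successive halving, each rung's fidelity is eta times the previous one, and roughly the best 1/eta of configurations survive to the next rung. A minimal sketch, with `rung_budgets` as our own illustrative helper, not the NePS implementation:

```python
def rung_budgets(min_budget: float, max_budget: float, eta: int = 3) -> list:
    """Fidelity levels from min to max, each eta times the previous,
    capped at max_budget."""
    budgets = []
    b = min_budget
    while b < max_budget:
        budgets.append(b)
        b *= eta
    budgets.append(max_budget)  # always evaluate at the full fidelity
    return budgets
```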

src/neps/optimizers/default_searchers/mf_ei_bo.yaml

Lines changed: 0 additions & 26 deletions
This file was deleted.

src/neps/optimizers/default_searchers/mobster.yaml

Lines changed: 4 additions & 4 deletions
@@ -3,17 +3,17 @@ searcher_init:
 searcher_kwargs:
   # Arguments that can be modified by the user
   eta: 3
-  initial_design_type: max_budget # or {"unique_configs"}
+  initial_design_type: max_budget
   use_priors: false
   random_interleave_prob: 0.0
   sample_default_first: false
   sample_default_at_target: false
 
   # arguments for model
-  surrogate_model: gp # or {"gp_hierarchy", "deep_gp"}
-  acquisition: EI # or {"LogEI", "AEI", "MFEI"}
+  surrogate_model: gp # or {"gp_hierarchy"}
+  acquisition: EI # or {"LogEI", "AEI"}
   log_prior_weighted: false
-  acquisition_sampler: random # or {"mutation", "evolution", "freeze-thaw"}
+  acquisition_sampler: random # or {"mutation", "evolution"}
 
   # Arguments that can not be modified by the user
   # sampling_policy: RandomUniformPolicy

src/neps/optimizers/default_searchers/pibo.yaml

Lines changed: 4 additions & 4 deletions
@@ -3,13 +3,13 @@ searcher_init:
 searcher_kwargs:
   # Arguments that can be modified by the user
   initial_design_size: 10
-  surrogate_model: gp # or {"gp_hierarchy", "deep_gp"}
-  acquisition: EI # or {"LogEI", "AEI", "MFEI"}
+  surrogate_model: gp # or {"gp_hierarchy"}
+  acquisition: EI # or {"LogEI", "AEI"}
   log_prior_weighted: false
-  acquisition_sampler: mutation # or {"random", "evolution", "freeze-thaw"}
+  acquisition_sampler: mutation # or {"random", "evolution"}
   random_interleave_prob: 0.0
   disable_priors: false
-  prior_confidence: medium # or {"low", "high"}
+  prior_confidence: medium  # or {"low", "high"}
   sample_default_first: false
 
   # Other arguments:

src/neps/optimizers/default_searchers/priorband.yaml

Lines changed: 5 additions & 5 deletions
@@ -3,16 +3,16 @@ searcher_init:
 searcher_kwargs:
   # Arguments that can be modified by the user
   eta: 3
-  initial_design_type: max_budget # or {"unique_configs"}
-  prior_confidence: medium # or {"low", "high"}
+  initial_design_type: max_budget
+  prior_confidence: medium  # or {"low", "high"}
   random_interleave_prob: 0.0
   sample_default_first: true
   sample_default_at_target: false
-  prior_weight_type: geometric # or {"linear", "50-50"}
-  inc_sample_type: mutation # or {"crossover", "gaussian", "hypersphere"}
+  prior_weight_type: geometric
+  inc_sample_type: mutation
   inc_mutation_rate: 0.5
   inc_mutation_std: 0.25
-  inc_style: dynamic # or {"decay", "constant"}
+  inc_style: dynamic
 
   # arguments for model
   model_based: false # crucial argument to set to allow model-search
