Commit

Iterating yaml api PR

Neeratyoy committed Oct 6, 2023
1 parent 9f46b70 commit d3d06b5
Showing 15 changed files with 109 additions and 82 deletions.
12 changes: 6 additions & 6 deletions src/neps/api.py
Original file line number Diff line number Diff line change
@@ -248,10 +248,10 @@ def run(
# NePS decides the searcher according to the pipeline space.
if pipeline_space.has_prior:
searcher = "priorband" if pipeline_space.has_fidelity else "pibo"
elif pipeline_space.has_fidelity:
searcher = "hyperband"
else:
searcher = "bayesian_optimization"
searcher = (
"hyperband" if pipeline_space.has_fidelity else "bayesian_optimization"
)
else:
# Users choose one of NePS searchers.
user_defined_searcher = True
@@ -318,14 +318,14 @@ def run(
searcher_instance,
searcher_info,
root_directory,
development_stage_id=development_stage_id,
task_id=task_id,
max_evaluations_total=max_evaluations_total,
max_evaluations_per_run=max_evaluations_per_run,
overwrite_optimization_dir=overwrite_working_directory,
continue_until_max_evaluation_completed=continue_until_max_evaluation_completed,
development_stage_id=development_stage_id,
task_id=task_id,
logger=logger,
post_evaluation_hook=_post_evaluation_hook_function(
loss_value_on_error, ignore_errors
),
overwrite_optimization_dir=overwrite_working_directory,
)
85 changes: 69 additions & 16 deletions src/neps/optimizers/README.md
@@ -8,41 +8,94 @@ If you prefer not to specify a particular optimizer for your AutoML task, you ca

The optimizer selection is based on the following characteristics of your search space:

- If it has fidelity: hyperband
- If it has a prior: pibo
- If it has both fidelity and a prior: priorband
- If it has neither: bayesian_optimization
- If it has fidelity: `hyperband`
- If it has both fidelity and a prior: `priorband`
- If it has a prior: `pibo`
- If it has neither: `bayesian_optimization`
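
The selection rules above mirror the logic in `neps/api.py` from this commit. A minimal sketch of that decision (the helper name `select_searcher` is hypothetical, introduced only for illustration):

```python
def select_searcher(has_prior: bool, has_fidelity: bool) -> str:
    # Priors take precedence; fidelity then decides between the
    # remaining searchers, as in the auto-selection in `neps/api.py`.
    if has_prior:
        return "priorband" if has_fidelity else "pibo"
    return "hyperband" if has_fidelity else "bayesian_optimization"
```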

For example, running the following, without specifying a searcher, will choose an optimizer depending on the `pipeline_space` passed.
```python
neps.run(
    run_pipeline=run_function,
    pipeline_space=pipeline_space,
    root_directory="results/",
    max_evaluations_total=25,
    # no searcher specified
)
```

### 2. Choosing one of NePS Optimizers

We have also prepared some optimizers with specific hyperparameters that we believe can generalize well to most AutoML tasks and use cases. For more details on the available default optimizers and the algorithms that can be called, please refer to the next section on SearcherConfigs.
We have also prepared some optimizers with specific hyperparameters that we believe can generalize well to most AutoML tasks and use cases. For more details on the available default optimizers and the algorithms that can be called, please refer to the next section on [SearcherConfigs](#Searcher-Configurations).

```python
neps.run(
    run_pipeline=run_function,
    pipeline_space=pipeline_space,
    root_directory="results/",
    max_evaluations_total=25,
    # searcher specified, along with an argument
    searcher="bayesian_optimization",
    initial_design_size=5,
)
```

For more optimizers, please refer [here](#List-Available-Searchers).

### 3. Custom Optimizer Configuration via YAML

If you want more control over the optimizer's hyperparameters, you can create your own YAML configuration file specifying the hyperparameters for your preferred optimizer. To use this custom configuration, provide the path to your YAML file via the `searcher_path` parameter when running the optimizer. The library will then load your custom settings and use them for optimization.

Here's the format of the YAML configuration using `Bayesian Optimization` as an example:
Here's the format of a custom YAML (`custom_bo.yaml`) configuration using `Bayesian Optimization` as an example:

```yaml
searcher_init:
algorithm: bayesian_optimization
searcher_kwargs: # Specific arguments depending on the searcher
initial_design_size: 5
surrogate_model: gp_hierarchy # or {"gp_hierarchy", "deep_gp"}
acquisition: EI # or {"LogEI", "AEI", "MFEI"}
searcher_kwargs: # Specific arguments depending on the searcher
initial_design_size: 7
surrogate_model: gp
acquisition: EI
log_prior_weighted: false
acquisition_sampler: random # or {"mutation", "evolution", "freeze-thaw"}
random_interleave_prob: 0.0
acquisition_sampler: random
random_interleave_prob: 0.1
disable_priors: false
prior_confidence: high
sample_default_first: true
sample_default_first: false
```
```python
neps.run(
    run_pipeline=run_function,
    pipeline_space=pipeline_space,
    root_directory="results/",
    max_evaluations_total=25,
    # `custom_bo.yaml` should be in `searcher_path`
    searcher_path="custom/path/to/directory",
    searcher="custom_bo",
)
```

### 4. Hyperparameter Overrides

If you want to make on-the-fly adjustments to the optimizer's hyperparameters without modifying the YAML configuration file, you can pass keyword arguments (kwargs) directly to `neps.run`. This lets you fine-tune specific hyperparameters without updating the YAML file. Any hyperparameter values provided as kwargs take precedence over those specified in the YAML configuration.

### Note for Contributors
```python
neps.run(
    run_pipeline=run_function,
    pipeline_space=pipeline_space,
    root_directory="results/",
    max_evaluations_total=25,
    # `custom_bo.yaml` should be in `searcher_path`
    searcher_path="custom/path/to/directory",
    searcher="custom_bo",
    initial_design_size=5,  # overrides the value in custom_bo.yaml
    random_interleave_prob=0.25,  # overrides the value in custom_bo.yaml
)
```

## Note for Contributors

When designing a new optimizer, it's essential to create a YAML configuration file in the `default_searchers` folder under `src/neps/optimizers`. This YAML file should contain the default configuration settings that you believe should be used when the user chooses the searcher.
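
A minimal sketch of such a default configuration, following the format of the other files in `default_searchers` (the algorithm name and values here are placeholders, not a real searcher):

```yaml
searcher_init:
  algorithm: my_algorithm  # placeholder: must match a registered algorithm
searcher_kwargs:
  # Arguments that can be modified by the user
  initial_design_size: 10
  random_interleave_prob: 0.0
```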

@@ -82,11 +82,11 @@ print("Available searching algorithms:", algorithms)

### Find Searchers Using a Specific Algorithm

If you want to identify which NePS searchers are using a specific searching algorithm (e.g., Bayesian Optimization, Hyperband, PriorBand...), you can use the `get_searcher_from_alg` function. It returns a list of searchers utilizing the specified algorithm:
If you want to identify which NePS searchers are using a specific searching algorithm (e.g., Bayesian Optimization, Hyperband, PriorBand...), you can use the `get_searcher_from_algorithm` function. It returns a list of searchers utilizing the specified algorithm:

```python
algorithm = "bayesian_optimization" # Replace with the desired algorithm
searchers = SearcherConfigs.get_searcher_from_alg(algorithm)
searchers = SearcherConfigs.get_searcher_from_algorithm(algorithm)
print(f"Searchers using {algorithm}:", searchers)
```

2 changes: 1 addition & 1 deletion src/neps/optimizers/default_searchers/asha.yaml
@@ -4,7 +4,7 @@ searcher_kwargs:
# Arguments that can be modified by the user
eta: 3
early_stopping_rate: 0
initial_design_type: max_budget # or {"unique_configs"}
initial_design_type: max_budget
use_priors: false
random_interleave_prob: 0.0
sample_default_first: false
4 changes: 2 additions & 2 deletions src/neps/optimizers/default_searchers/asha_prior.yaml
@@ -4,8 +4,8 @@ searcher_kwargs:
# Arguments that can be modified by the user
eta: 3
early_stopping_rate: 0
initial_design_type: max_budget # or {"unique_configs"}
prior_confidence: medium # or {"low", "high"}
initial_design_type: max_budget
prior_confidence: medium # or {"low", "high"}
random_interleave_prob: 0.0
sample_default_first: false
sample_default_at_target: false
@@ -3,10 +3,10 @@ searcher_init:
searcher_kwargs:
# Arguments that can be modified by the user
initial_design_size: 10
surrogate_model: gp # or {"gp_hierarchy", "deep_gp"}
acquisition: EI # or {"LogEI", "AEI", "MFEI"}
surrogate_model: gp # or {"gp_hierarchy"}
acquisition: EI # or {"LogEI", "AEI"}
log_prior_weighted: false
acquisition_sampler: mutation # or {"random", "evolution", "freeze-thaw"}
acquisition_sampler: mutation # or {"random", "evolution"}
random_interleave_prob: 0.0
disable_priors: true
sample_default_first: false
2 changes: 1 addition & 1 deletion src/neps/optimizers/default_searchers/hyperband.yaml
@@ -3,7 +3,7 @@ searcher_init:
searcher_kwargs:
# Arguments that can be modified by the user
eta: 3
initial_design_type: max_budget # or {"unique_configs"}
initial_design_type: max_budget
use_priors: false
random_interleave_prob: 0.0
sample_default_first: false
26 changes: 0 additions & 26 deletions src/neps/optimizers/default_searchers/mf_ei_bo.yaml

This file was deleted.

8 changes: 4 additions & 4 deletions src/neps/optimizers/default_searchers/mobster.yaml
@@ -3,17 +3,17 @@ searcher_init:
searcher_kwargs:
# Arguments that can be modified by the user
eta: 3
initial_design_type: max_budget # or {"unique_configs"}
initial_design_type: max_budget
use_priors: false
random_interleave_prob: 0.0
sample_default_first: false
sample_default_at_target: false

# arguments for model
surrogate_model: gp # or {"gp_hierarchy", "deep_gp"}
acquisition: EI # or {"LogEI", "AEI", "MFEI"}
surrogate_model: gp # or {"gp_hierarchy"}
acquisition: EI # or {"LogEI", "AEI"}
log_prior_weighted: false
acquisition_sampler: random # or {"mutation", "evolution", "freeze-thaw"}
acquisition_sampler: random # or {"mutation", "evolution"}

# Arguments that can not be modified by the user
# sampling_policy: RandomUniformPolicy
8 changes: 4 additions & 4 deletions src/neps/optimizers/default_searchers/pibo.yaml
@@ -3,13 +3,13 @@ searcher_init:
searcher_kwargs:
# Arguments that can be modified by the user
initial_design_size: 10
surrogate_model: gp # or {"gp_hierarchy", "deep_gp"}
acquisition: EI # or {"LogEI", "AEI", "MFEI"}
surrogate_model: gp # or {"gp_hierarchy"}
acquisition: EI # or {"LogEI", "AEI"}
log_prior_weighted: false
acquisition_sampler: mutation # or {"random", "evolution", "freeze-thaw"}
acquisition_sampler: mutation # or {"random", "evolution"}
random_interleave_prob: 0.0
disable_priors: false
prior_confidence: medium # or {"low", "high"}
prior_confidence: medium # or {"low", "high"}
sample_default_first: false

# Other arguments:
10 changes: 5 additions & 5 deletions src/neps/optimizers/default_searchers/priorband.yaml
@@ -3,16 +3,16 @@ searcher_init:
searcher_kwargs:
# Arguments that can be modified by the user
eta: 3
initial_design_type: max_budget # or {"unique_configs"}
prior_confidence: medium # or {"low", "high"}
initial_design_type: max_budget
prior_confidence: medium # or {"low", "high"}
random_interleave_prob: 0.0
sample_default_first: true
sample_default_at_target: false
prior_weight_type: geometric # or {"linear", "50-50"}
inc_sample_type: mutation # or {"crossover", "gaussian", "hypersphere"}
prior_weight_type: geometric
inc_sample_type: mutation
inc_mutation_rate: 0.5
inc_mutation_std: 0.25
inc_style: dynamic # or {"decay", "constant"}
inc_style: dynamic

# arguments for model
model_based: false # crucial argument to set to allow model-search
18 changes: 9 additions & 9 deletions src/neps/optimizers/default_searchers/priorband_bo.yaml
@@ -3,25 +3,25 @@ searcher_init:
searcher_kwargs:
# Arguments that can be modified by the user
eta: 3
initial_design_type: max_budget # or {"unique_configs"}
prior_confidence: medium # or {"low", "high"}
initial_design_type: max_budget
prior_confidence: medium # or {"low", "high"}
random_interleave_prob: 0.0
sample_default_first: true
sample_default_at_target: false
prior_weight_type: geometric # or {"linear", "50-50"}
inc_sample_type: mutation # or {"crossover", "gaussian", "hypersphere"}
prior_weight_type: geometric
inc_sample_type: mutation
inc_mutation_rate: 0.5
inc_mutation_std: 0.25
inc_style: dynamic # or {"decay", "constant"}
inc_style: dynamic

# arguments for model
model_based: true # crucial argument to set to allow model-search
modelling_type: joint # or {"rung"}
modelling_type: joint
initial_design_size: 10
surrogate_model: gp # or {"gp_hierarchy", "deep_gp"}
acquisition: EI # or {"LogEI", "AEI", "MFEI"}
surrogate_model: gp # or {"gp_hierarchy"}
acquisition: EI # or {"LogEI", "AEI"}
log_prior_weighted: false
acquisition_sampler: mutation # or {"random", "evolution", "freeze-thaw"}
acquisition_sampler: mutation # or {"random", "evolution"}

# Arguments that can not be modified by the user
# sampling_policy: EnsemblePolicy
@@ -4,7 +4,7 @@ searcher_kwargs:
# Arguments that can be modified by the user
eta: 3
early_stopping_rate: 0
initial_design_type: max_budget # or {"unique_configs"}
initial_design_type: max_budget
use_priors: false
random_interleave_prob: 0.0
sample_default_first: false
@@ -4,8 +4,8 @@ searcher_kwargs:
# Arguments that can be modified by the user
eta: 3
early_stopping_rate: 0
initial_design_type: max_budget # or {"unique_configs"}
prior_confidence: medium # or {"low", "high"}
initial_design_type: max_budget
prior_confidence: medium # or {"low", "high"}
random_interleave_prob: 0.0
sample_default_first: false
sample_default_at_target: false
2 changes: 1 addition & 1 deletion src/neps/optimizers/info.py
@@ -63,7 +63,7 @@ def get_available_algorithms() -> list[str]:
return list(prev_algorithms)

@staticmethod
def get_searcher_from_alg(algorithm: str) -> list[str]:
def get_searcher_from_algorithm(algorithm: str) -> list[str]:
"""
Get all NePS searchers that use a specific searching algorithm.
2 changes: 1 addition & 1 deletion src/neps/utils/common.py
@@ -24,7 +24,7 @@ def get_searcher_data(searcher: str) -> str:
resource_path = os.path.join(parent_directory, folder_path, f"{searcher}.yaml")

if not os.path.exists(resource_path):
raise FileNotFoundError(f"Searcher '{searcher}' does not exit.")
raise FileNotFoundError(f"Searcher '{searcher}' does not exist.")

with open(resource_path, "rb") as file:
data = yaml.safe_load(file)