Remove neps argument 'searcher_path' and yaml argument searcher_kwargs + New loading design for optimizer #105

Merged (30 commits) on Jun 21, 2024

Commits:
2c1d1b7
add mixed usage functionality for run_args and neps func arguments
danrgll May 29, 2024
f7373d9
align the tests for declarative usage to the ones that are in the doc…
danrgll May 29, 2024
68ed702
fix errors in tests
danrgll May 29, 2024
bf023e9
change design for pre-load-hooks
danrgll Jun 4, 2024
13231b2
simplify code
danrgll Jun 5, 2024
20cbb31
add docstring rm solved ToDos
danrgll Jun 5, 2024
044c61a
change design for searcher config
danrgll Jun 5, 2024
a2f259b
fix pipeline_space example
danrgll Jun 5, 2024
f8f36ae
rm neps argument searcher_path
danrgll Jun 7, 2024
3d55400
fix pre-commit error
danrgll Jun 7, 2024
54b2e63
Merge branch 'master' into new-optimizer-yaml-design
danrgll Jun 11, 2024
50d343e
merge master + fix loading neps-searchers from yaml
danrgll Jun 13, 2024
3b4769a
update docs to the new design
danrgll Jun 13, 2024
85c9fa5
define dict in run_args yaml
danrgll Jun 13, 2024
f17b1f1
change searcher key algorithm to strategy + rm searcher_kwargs argument
danrgll Jun 14, 2024
0d039a5
change algorithm to strategy
danrgll Jun 14, 2024
12bc542
update rm searcher_kwargs key from yaml for user
danrgll Jun 14, 2024
5f3e541
update rm searcher_kwargs key from yaml for user
danrgll Jun 14, 2024
c06dde7
fix pre-commit
danrgll Jun 14, 2024
ff566dd
add providing arguments for loaded class BaseOptimizer via yaml + fix…
danrgll Jun 17, 2024
fbc3d89
add searcher_args to searcher_info for custom class optimizer loaded …
danrgll Jun 17, 2024
659614f
add tests + fix docs
danrgll Jun 17, 2024
f23c882
update docs
danrgll Jun 17, 2024
736b58a
adapt SearcherConfigs to the new dict design of optimizers
danrgll Jun 17, 2024
285b993
clean up code and docsctrings
danrgll Jun 18, 2024
4417942
code clean up + add notes to docs
danrgll Jun 18, 2024
f15d58f
update declarative example
danrgll Jun 18, 2024
3cf2b73
fix test
danrgll Jun 19, 2024
1a530e5
change boolean values in yamls from [True, False] to [true, false]
danrgll Jun 21, 2024
9406896
fix path reference
danrgll Jun 21, 2024
8 changes: 5 additions & 3 deletions docs/doc_yamls/customizing_neps_optimizer.yaml
@@ -1,3 +1,4 @@
# Customizing NePS Searcher
run_pipeline:
path: path/to/your/run_pipeline.py # Path to the function file
name: example_pipeline # Function name within the file
@@ -6,15 +7,16 @@ pipeline_space:
learning_rate:
lower: 1e-5
upper: 1e-1
log: True # Log scale for learning rate
log: true # Log scale for learning rate
optimizer:
choices: [adam, sgd, adamw]
epochs: 50

root_directory: path/to/results # Directory for result storage
max_evaluations_total: 20 # Budget
max_evaluations_total: 20 # Budget
searcher:
algorithm: bayesian_optimization # name linked with neps keywords, more information click here..?
strategy: bayesian_optimization # key for neps searcher
name: "my_bayesian" # optional; changing the searcher_name for better recognition
# Specific arguments depending on the searcher
initial_design_size: 7
surrogate_model: gp
10 changes: 5 additions & 5 deletions docs/doc_yamls/defining_hooks.yaml
@@ -1,4 +1,4 @@
# Basic NEPS Configuration Example
# Hooks
run_pipeline:
path: path/to/your/run_pipeline.py # Path to the function file
name: example_pipeline # Function name within the file
@@ -7,18 +7,18 @@ pipeline_space:
learning_rate:
lower: 1e-5
upper: 1e-1
log: True # Log scale for learning rate
log: true # Log scale for learning rate
epochs:
lower: 5
upper: 20
is_fidelity: True
is_fidelity: true
optimizer:
choices: [adam, sgd, adamw]
batch_size: 64

root_directory: path/to/results # Directory for result storage
max_evaluations_total: 20 # Budget
max_evaluations_total: 20 # Budget

pre_load_hooks:
hook1: path/to/your/hooks.py # (function_name: Path to the function's file)
hook2: path/to/your/hooks.py # Different function name from the same file source
hook2: path/to/your/hooks.py # Different function name 'hook2' from the same file source
10 changes: 5 additions & 5 deletions docs/doc_yamls/full_configuration_template.yaml
@@ -7,11 +7,11 @@ pipeline_space:
learning_rate:
lower: 1e-5
upper: 1e-1
log: True # Log scale for learning rate
log: true
epochs:
lower: 5
upper: 20
is_fidelity: True
is_fidelity: true
optimizer:
choices: [adam, sgd, adamw]
batch_size: 64
@@ -21,14 +21,14 @@ max_evaluations_total: 20 # Budget
max_cost_total:

# Debug and Monitoring
overwrite_working_directory: True
post_run_summary: False
overwrite_working_directory: true
post_run_summary: false
development_stage_id:
task_id:

# Parallelization Setup
max_evaluations_per_run:
continue_until_max_evaluation_completed: False
continue_until_max_evaluation_completed: false

# Error Handling
loss_value_on_error:
9 changes: 5 additions & 4 deletions docs/doc_yamls/loading_own_optimizer.yaml
@@ -1,3 +1,4 @@
# Loading Optimizer Class
run_pipeline:
path: path/to/your/run_pipeline.py # Path to the function file
name: example_pipeline # Function name within the file
@@ -6,16 +7,16 @@ pipeline_space:
learning_rate:
lower: 1e-5
upper: 1e-1
log: True # Log scale for learning rate
log: true # Log scale for learning rate
optimizer:
choices: [adam, sgd, adamw]
epochs: 50

root_directory: path/to/results # Directory for result storage
max_evaluations_total: 20 # Budget
max_evaluations_total: 20 # Budget
searcher:
path: path/to/your/searcher.py # Path to the class
name: CustomOptimizer # class name within the file
path: path/to/your/searcher.py # Path to the class
name: CustomOptimizer # class name within the file
# Specific arguments depending on your searcher
initial_design_size: 7
surrogate_model: gp
6 changes: 3 additions & 3 deletions docs/doc_yamls/loading_pipeline_space_dict.yaml
@@ -1,11 +1,11 @@
# Loading pipeline space from a python dict
run_pipeline:
path: path/to/your/run_pipeline.py # Path to the function file
name: example_pipeline # Function name within the file
name: example_pipeline # Function name within the file

pipeline_space:
path: path/to/your/search_space.py # Path to the dict file
name: pipeline_space # Name of the dict instance
name: pipeline_space # Name of the dict instance

root_directory: path/to/results # Directory for result storage
max_evaluations_total: 20 # Budget
max_evaluations_total: 20 # Budget
4 changes: 2 additions & 2 deletions docs/doc_yamls/outsourcing_optimizer.yaml
@@ -7,12 +7,12 @@ pipeline_space:
learning_rate:
lower: 1e-5
upper: 1e-1
log: True # Log scale for learning rate
log: true # Log scale for learning rate
optimizer:
choices: [adam, sgd, adamw]
epochs: 50

root_directory: path/to/results # Directory for result storage
max_evaluations_total: 20 # Budget
max_evaluations_total: 20 # Budget

searcher: path/to/your/searcher_setup.yaml
4 changes: 2 additions & 2 deletions docs/doc_yamls/outsourcing_pipeline_space.yaml
@@ -1,10 +1,10 @@
# Pipeline space settings from YAML
# Pipeline space settings from separate YAML
run_pipeline:
path: path/to/your/run_pipeline.py # Path to the function file
name: example_pipeline # Function name within the file

pipeline_space: path/to/your/pipeline_space.yaml

root_directory: path/to/results # Directory for result storage
max_evaluations_total: 20 # Budget
max_evaluations_total: 20 # Budget

43 changes: 21 additions & 22 deletions docs/doc_yamls/pipeline_space.yaml
@@ -1,22 +1,21 @@
# pipeline_space including priors and fidelity
pipeline_space:
learning_rate:
lower: 1e-5
upper: 1e-1
log: True # Log scale for learning rate
default: 1e-2
default_confidence: "medium"
epochs:
lower: 5
upper: 20
is_fidelity: True
dropout_rate:
lower: 0.1
upper: 0.5
default: 0.2
default_confidence: "high"
optimizer:
choices: [adam, sgd, adamw]
default: adam
# default confidence low
batch_size: 64
# Pipeline_space including priors and fidelity
learning_rate:
lower: 1e-5
upper: 1e-1
log: true # Log scale for learning rate
default: 1e-2
default_confidence: "medium"
epochs:
lower: 5
upper: 20
is_fidelity: true
dropout_rate:
lower: 0.1
upper: 0.5
default: 0.2
default_confidence: "high"
optimizer:
choices: [adam, sgd, adamw]
default: adam
# if default_confidence is not defined, it defaults to 'low'
batch_size: 64
2 changes: 1 addition & 1 deletion docs/doc_yamls/set_up_optimizer.yaml
@@ -1,4 +1,4 @@
algorithm: bayesian_optimization
strategy: bayesian_optimization
# Specific arguments depending on the searcher
initial_design_size: 7
surrogate_model: gp
4 changes: 2 additions & 2 deletions docs/doc_yamls/simple_example.yaml
@@ -3,11 +3,11 @@ pipeline_space:
learning_rate:
lower: 1e-5
upper: 1e-1
log: True # Log scale for learning rate
log: true # Log scale for learning rate
epochs:
lower: 5
upper: 20
is_fidelity: True
is_fidelity: true
optimizer:
choices: [adam, sgd, adamw]
batch_size: 64
6 changes: 3 additions & 3 deletions docs/doc_yamls/simple_example_including_run_pipeline.yaml
@@ -1,17 +1,17 @@
# Simple NePS configuration including run_pipeline
run_pipeline:
path: path/to/your/run_pipeline.py # Path to the function file
name: example_pipeline # Function name within the file
name: example_pipeline # Function name within the file

pipeline_space:
learning_rate:
lower: 1e-5
upper: 1e-1
log: True # Log scale for learning rate
log: true # Log scale for learning rate
epochs:
lower: 5
upper: 20
is_fidelity: True
is_fidelity: true
optimizer:
choices: [adam, sgd, adamw]
batch_size: 64
25 changes: 19 additions & 6 deletions docs/reference/declarative_usage.md
@@ -1,10 +1,13 @@
!!! note "Work in Progress"
This document is currently a work in progress and may contain incomplete or preliminary information.

## Introduction
### Configuring with YAML
Configure your experiments using a YAML file, which serves as a central reference for setting up your project.
This approach simplifies sharing, reproducing, and modifying configurations.
This approach simplifies sharing, reproducing and modifying configurations.

!!! note
You can partially define arguments in the YAML file and partially provide the arguments directly to `neps.run`.
However, double referencing is not allowed. You cannot define the same argument in both places.
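The no-double-referencing rule can be made concrete with a minimal sketch. This is illustrative only (plain dictionaries, not NePS internals), and the argument names shown are just examples:

```python
# Illustrative sketch, not NePS internals: arguments may come either from the
# YAML file or directly from neps.run, but never from both.
yaml_args = {"root_directory": "results", "max_evaluations_total": 20}
direct_args = {"post_run_summary": True}  # example of a directly-passed argument

# Any key present in both places would be rejected.
overlap = yaml_args.keys() & direct_args.keys()
if overlap:
    raise ValueError(f"Argument(s) defined in both places: {overlap}")
```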

#### Simple YAML Example
Below is a straightforward YAML configuration example for NePS covering the required arguments.
=== "config.yaml"
@@ -27,7 +30,7 @@ Below is a straightforward YAML configuration example for NePS covering the requ
```


#### Including `run_pipeline` in config.yaml for External Referencing
#### Including `run_pipeline` in `run_args` for External Referencing
In addition to setting experimental parameters via YAML, this configuration example also specifies the pipeline function
and its location, enabling more flexible project structures.
=== "config.yaml"
@@ -65,7 +68,7 @@ but also advanced settings for more complex setups.

The `searcher` key used in the YAML configuration corresponds to the same keys used for selecting an optimizer directly
through `neps.run`. For a detailed list of integrated optimizers, see [here](optimizers.md#list-available-searchers).
!!! note "Note on Undefined Keys"
!!! note "Note on undefined keys in `run_args` (config.yaml)"
Not all configurations are explicitly defined in this template. Any undefined key in the YAML file is mapped to
the internal default settings of NePS. This ensures that your experiments can run even if certain parameters are
omitted.
@@ -74,6 +77,11 @@ through `neps.run`. For a detailed list of integrated optimizers, see [here](opt
### Customizing NePS optimizer
Customize an internal NePS optimizer by specifying its parameters directly under the key `searcher` in the
`config.yaml` file.

!!! note
For `searcher_kwargs` of `neps.run`, the optimizer arguments passed via the YAML file and those passed directly via
`neps.run` are merged. In this special case, if the same argument is defined in both places, the value from
`searcher_kwargs` takes priority.
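The merge rule can be sketched with plain dictionaries. This is illustrative only, not NePS internals, and the argument names are examples:

```python
# Searcher arguments defined under `searcher:` in config.yaml (example values).
yaml_searcher_args = {"initial_design_size": 7, "surrogate_model": "gp"}

# Arguments passed directly via neps.run(searcher_kwargs=...).
searcher_kwargs = {"initial_design_size": 5}

# searcher_kwargs is applied last, so it wins on conflicting keys.
merged = {**yaml_searcher_args, **searcher_kwargs}
# merged == {"initial_design_size": 5, "surrogate_model": "gp"}
```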
=== "config.yaml"
```yaml
--8<-- "docs/doc_yamls/customizing_neps_optimizer.yaml"
@@ -158,7 +166,10 @@ search spaces must be loaded via a dictionary, which is then referenced in the `


### Integrating Custom Optimizers
You can also load your own custom optimizer and change its arguments in `config.yaml`.
If you want to write your own optimizer class as a subclass of the base optimizer, you can load this custom
optimizer class and define its arguments in `config.yaml`.

Note: You can still overwrite arguments via the `searcher_kwargs` of `neps.run`, just as for the internal searchers.
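As a rough, hypothetical sketch of the shape such a class can take (the class name and arguments are placeholders, and a real implementation must subclass NePS's base optimizer; here we only show how the extra YAML keys arrive as constructor arguments):

```python
# Hypothetical sketch only: a class that the `searcher` keys
# (path: path/to/your/searcher.py, name: CustomOptimizer) could point to.
class CustomOptimizer:
    def __init__(self, initial_design_size=7, surrogate_model="gp", **kwargs):
        # Extra keys under `searcher:` in config.yaml are forwarded here
        # as keyword arguments.
        self.initial_design_size = initial_design_size
        self.surrogate_model = surrogate_model

# e.g. `initial_design_size: 5` in the YAML would arrive like this:
opt = CustomOptimizer(initial_design_size=5)
```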
=== "config.yaml"
```yaml
--8<-- "docs/doc_yamls/loading_own_optimizer.yaml"
@@ -173,6 +184,8 @@ You can also load your own custom optimizer and change its arguments in `config.
neps.run(run_args="path/to/your/config.yaml")
```



### Adding Custom Hooks to Your Configuration
Define hooks in your YAML configuration to extend the functionality of your experiment.
=== "config.yaml"
36 changes: 12 additions & 24 deletions docs/reference/neps_run.md
@@ -190,11 +190,7 @@ Any new workers that come online will automatically pick up work and work togeth

## YAML Configuration
You have the option to configure all arguments using a YAML file through [`neps.run(run_args=...)`][neps.api.run].
For more on yaml usage, please visit the dedicated [page on usage of YAML with NePS](../reference/yaml_usage.md).

!!! example "In Progress"

This feature is currently in development and is subject to change.
For more on yaml usage, please visit the dedicated [page on usage of YAML with NePS](../reference/declarative_usage.md).

Parameters not explicitly defined within this file will receive their default values.

@@ -203,18 +199,17 @@

```yaml
# path/to/your/config.yaml
run_args:
run_pipeline:
path: "path/to/your/run_pipeline.py" # File path of the run_pipeline function
name: "name_of_your_run_pipeline" # Function name
pipeline_space: "path/to/your/search_space.yaml" # Path of the search space yaml file
root_directory: "neps_results" # Output directory for results
max_evaluations_total: 100
post_run_summary: # Defaults applied if left empty
searcher: "bayesian_optimization"
searcher_kwargs:
initial_design_size: 5
surrogate_model: "gp"
run_pipeline:
path: "path/to/your/run_pipeline.py" # File path of the run_pipeline function
name: "name_of_your_run_pipeline" # Function name
pipeline_space: "path/to/your/search_space.yaml" # Path of the search space yaml file
root_directory: "neps_results" # Output directory for results
max_evaluations_total: 100
post_run_summary: # Defaults applied if left empty
searcher:
strategy: "bayesian_optimization"
initial_design_size: 5
surrogate_model: "gp"
```

=== "Python"
@@ -223,13 +218,6 @@
neps.run(run_args="path/to/your/config.yaml")
```

!!! warning

Currently we have a strict usage for `run_args`.
So you can define either all arguments by providing them directly to neps.run or via the yaml file.
This might change in the future.
If you use yaml, directly provided arguments get overwritten either by the defined yaml config or the default value.

## Handling Errors
Things go wrong during optimization runs and it's important to consider what to do in these cases.
By default, NePS will halt the optimization process when an error occurs, but you can choose to set `ignore_errors=`, providing a `loss_value_on_error=` and `cost_value_on_error=` to control what values should be reported to the optimization process.
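The effect of `loss_value_on_error=` can be sketched as follows. This is an illustrative sketch of the behaviour described above, not NePS internals:

```python
# Illustrative sketch, not NePS internals: substitute a fixed penalty loss
# when an evaluation raises, instead of halting the run.
def evaluate_with_error_handling(run_pipeline, config, loss_value_on_error=None):
    try:
        return run_pipeline(**config)
    except Exception:
        if loss_value_on_error is not None:
            return loss_value_on_error  # report the penalty loss and continue
        raise  # default behaviour: halt the optimization process

# A failing evaluation (division by zero) reports the penalty instead:
result = evaluate_with_error_handling(
    lambda learning_rate: 1 / learning_rate,
    {"learning_rate": 0.0},
    loss_value_on_error=2.0,
)
```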