improve README writing, fix links, remove dead weight
DaStoll committed Jun 28, 2024
1 parent 092ef0d commit 08cba52
Showing 1 changed file with 19 additions and 42 deletions.
61 changes: 19 additions & 42 deletions README.md
@@ -5,16 +5,16 @@
[![License](https://img.shields.io/pypi/l/neural-pipeline-search?color=informational)](LICENSE)
[![Tests](https://github.com/automl/neps/actions/workflows/tests.yaml/badge.svg)](https://github.com/automl/neps/actions)

Welcome to NePS, a powerful and flexible Python library for hyperparameter optimization (HPO) and neural architecture search (NAS), whose primary goal is to enable HPO and NAS for deep learners!

NePS houses recently published and well-established algorithms that can all be run massively in parallel on distributed setups, with tools to analyze runs, restart runs, and more, all tailored to the needs of deep learning experts.
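
NePS parallelizes by letting several workers share one `root_directory` and coordinate through the files inside it. Purely as a hedged, minimal sketch (not part of this README), the quickstart script shown further below — here given the hypothetical name `neps_quickstart.py` — could simply be launched several times:

```python
import subprocess

# Hypothetical: "neps_quickstart.py" contains the quickstart example shown below.
# All workers point at the same root_directory, so NePS distributes the
# evaluations among them automatically.
workers = [subprocess.Popen(["python", "neps_quickstart.py"]) for _ in range(4)]
for worker in workers:
    worker.wait()
```

The same pattern applies on a cluster: each job simply runs the script against the shared `root_directory`.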

Take a look at our [documentation](https://automl.github.io/neps/latest/) for all the details on how to use NePS!


## Key Features

In addition to the common features offered by traditional HPO and NAS libraries, NePS stands out with:

1. [**Hyperparameter Optimization (HPO) With Prior Knowledge:**](neps_examples/template/priorband_template.py)
   - NePS excels in efficiently tuning hyperparameters using algorithms that enable users to incorporate their prior knowledge within the search space. This builds on the insights presented in:
@@ -89,38 +89,29 @@ def run_pipeline(
```python
    }


# 2. Define a search space of parameters; use the same parameter names as in run_pipeline
pipeline_space = dict(
    hyperparameter_a=neps.FloatParameter(
        lower=0.001, upper=0.1, log=True  # The search space is sampled in log space.
    ),
    hyperparameter_b=neps.IntegerParameter(lower=1, upper=42),
    architecture_parameter=neps.CategoricalParameter(["option_a", "option_b"]),
)

# 3. Run the NePS optimization
logging.basicConfig(level=logging.INFO)
neps.run(
    run_pipeline=run_pipeline,
    pipeline_space=pipeline_space,
    root_directory="path/to/save/results",  # Replace with the actual path.
    max_evaluations_total=100,
)
```
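
The signature and body of `run_pipeline` sit above this hunk and are not shown here, and the code removed by this commit additionally demonstrated a multi-fidelity setup (`is_fidelity=True` on the integer parameter plus `searcher="hyperband"`). Purely as a hedged sketch that stitches these pieces together — with a toy objective standing in for real training code and the usual `"loss"` return key — this could look like:

```python
import logging

import neps


def run_pipeline(hyperparameter_a, hyperparameter_b, architecture_parameter):
    # Toy objective standing in for real training/validation code.
    loss = (hyperparameter_a - 0.01) ** 2 + 0.001 * (42 - hyperparameter_b)
    if architecture_parameter == "option_b":
        loss *= 0.9
    return {"loss": loss}


# Multi-fidelity variant of the search space, based on the lines removed above.
pipeline_space = dict(
    hyperparameter_a=neps.FloatParameter(lower=0.001, upper=0.1, log=True),
    hyperparameter_b=neps.IntegerParameter(
        lower=1, upper=42, is_fidelity=True  # Mark 'is_fidelity' for a multi-fidelity approach.
    ),
    architecture_parameter=neps.CategoricalParameter(["option_a", "option_b"]),
)

if __name__ == "__main__":
    logging.basicConfig(level=logging.INFO)
    neps.run(
        run_pipeline=run_pipeline,
        pipeline_space=pipeline_space,
        root_directory="path/to/save/results",  # Replace with the actual path.
        max_evaluations_total=100,
        searcher="hyperband",  # Optional; otherwise NePS decides based on the search space.
    )
```

With a fidelity parameter in the space, Hyperband evaluates many configurations at small values of `hyperparameter_b` and promotes only the promising ones to larger values; omit `searcher` and NePS picks a strategy itself.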

## Examples

Discover how NePS works through these practical examples:
* **[Pipeline Space via YAML](neps_examples/basic_usage/hpo_usage_example.py)**: Explore how to define the `pipeline_space` using a
YAML file instead of a dictionary.

* **[Hyperparameter Optimization (HPO)](neps_examples/basic_usage/hyperparameters.py)**: Learn the essentials of hyperparameter optimization with NePS.

* **[Architecture Search with Primitives](neps_examples/basic_usage/architecture.py)**: Dive into architecture search using primitives in NePS.
@@ -131,24 +122,10 @@ Discover how NePS works through these practical examples:

* **[Additional NePS Examples](neps_examples/)**: Explore more examples, including various use cases and advanced configurations in NePS.

## Documentation

For more details and features, please have a look at our [documentation](https://automl.github.io/neps/latest/).

## Analysing runs

See our [documentation on analysing runs](https://automl.github.io/neps/latest/analyse).

## Contributing

Please see the [documentation for contributors](https://automl.github.io/neps/latest/dev_docs/contributing/).

## Citations

Please consider citing us if you use our tool!

For pointers on citing the NePS package and papers, refer to our [documentation on citations](https://automl.github.io/neps/latest/citations/).

## Alternatives

NePS does not cover your use case? Have a look at [some alternatives](https://automl.github.io/neps/latest/alternatives).
