Showing 37 changed files with 618 additions and 177 deletions.
# Neural Pipeline Search (NePS)

[![PyPI version](https://img.shields.io/pypi/v/neural-pipeline-search?color=informational)](https://pypi.org/project/neural-pipeline-search/)
[![Python versions](https://img.shields.io/pypi/pyversions/neural-pipeline-search)](https://pypi.org/project/neural-pipeline-search/)
[![License](https://img.shields.io/pypi/l/neural-pipeline-search?color=informational)](LICENSE)
[![Tests](https://github.com/automl/neps/actions/workflows/tests.yaml/badge.svg)](https://github.com/automl/neps/actions)

Welcome to NePS, a powerful and flexible Python library for hyperparameter optimization (HPO) and neural architecture search (NAS). Its primary goal is to enable the adoption of HPO in practice for deep learners!

NePS houses recently published as well as more established algorithms, all capable of running massively parallel on any distributed setup, along with tools to analyze runs, restart runs, and more.

## Key Features

In addition to the common features offered by traditional HPO and NAS libraries, NePS stands out with the following key features:

1. [**Hyperparameter Optimization (HPO) With Prior Knowledge:**](https://github.com/automl/neps/tree/master/neps_examples/template/priorband_template.py)
    - NePS excels at efficiently tuning hyperparameters with algorithms that let users bring their prior knowledge into the search space. This leverages the insights presented in:
        - [PriorBand: Practical Hyperparameter Optimization in the Age of Deep Learning](https://arxiv.org/abs/2306.12370)
        - [πBO: Augmenting Acquisition Functions with User Beliefs for Bayesian Optimization](https://arxiv.org/abs/2204.11051)

2. [**Neural Architecture Search (NAS) With Context-free Grammar Search Spaces:**](https://github.com/automl/neps/tree/master/neps_examples/basic_usage/architecture.py)
    - NePS is equipped to handle context-free grammar search spaces, providing advanced capabilities for designing and optimizing architectures. This leverages the insights presented in:
        - [Construction of Hierarchical Neural Architecture Search Spaces based on Context-free Grammars](https://arxiv.org/abs/2211.01842)

3. [**Easy Parallelization and Resumption of Runs:**](https://automl.github.io/neps/latest/parallelization)
    - NePS simplifies parallelizing optimization tasks, both on individual computers and in distributed computing environments. It also allows users to conveniently resume these optimization tasks, ensuring a seamless and efficient workflow for long-running experiments.

4. [**Seamless User Code Integration:**](https://github.com/automl/neps/tree/master/neps_examples/template/)
    - NePS's modular design ensures flexibility and extensibility. Integrate NePS effortlessly into existing machine learning workflows.
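The parallelization in feature 3 rests on a simple idea: multiple workers coordinate through a shared results directory, each atomically claiming the next configuration to evaluate. The sketch below illustrates that coordination pattern in plain Python with a made-up objective; it is a conceptual illustration only, not NePS's actual implementation, and all file names are invented for the demo.

```python
import os
import tempfile

# Stand-in for a shared root directory on a common filesystem.
root = tempfile.mkdtemp()

# Pre-populate three pending configurations (learning rates).
configs = {"cfg_1": 0.01, "cfg_2": 0.05, "cfg_3": 0.001}
for name, lr in configs.items():
    with open(os.path.join(root, name), "w") as f:
        f.write(str(lr))

def worker() -> dict:
    """Claim and evaluate every pending config; safe to run from many processes."""
    results = {}
    for name in sorted(os.listdir(root)):
        if name.endswith(".claimed"):
            continue
        path = os.path.join(root, name)
        try:
            # The atomic rename acts as a lock: only one worker can claim a config.
            os.rename(path, path + ".claimed")
        except OSError:
            continue  # another worker claimed it first
        with open(path + ".claimed") as f:
            lr = float(f.read())
        results[name] = (lr - 0.01) ** 2  # stand-in for a real evaluation
    return results

results = worker()
```

Because each worker only needs the shared directory, scaling out means simply starting the same worker on more machines.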
# Getting Started

Getting started with NePS involves a straightforward yet powerful process, centered around its three main components. This approach ensures flexibility and efficiency in evaluating different architecture and hyperparameter configurations for your problem.

## The 3 Main Components

1. **Define a [`run_pipeline`](https://automl.github.io/neps/latest/run_pipeline) Function**: This function is essential for evaluating different configurations. You'll implement the specific logic for your problem within this function. For detailed instructions on initializing and effectively using `run_pipeline`, refer to the guide.

2. **Establish a [`pipeline_space`](https://automl.github.io/neps/latest/pipeline_space)**: Your search space for defining parameters. You can structure this in various formats, including dictionaries, YAML, or ConfigSpace. The guide offers insights into defining and configuring your search space.

3. **Execute with [`neps.run`](https://automl.github.io/neps/latest/neps_run)**: Optimize your `run_pipeline` over the `pipeline_space` using this function. For a thorough overview of the arguments and their explanations, check out the detailed documentation.

By following these steps and utilizing the extensive resources provided in the guides, you can tailor NePS to meet your specific requirements, ensuring a streamlined and effective optimization process.
## Basic Usage

In code, the usage pattern can look like this:

```python
import logging

import neps


# 1. Define a function that accepts hyperparameters and computes the validation error
def run_pipeline(
    hyperparameter_a: float, hyperparameter_b: int, architecture_parameter: str
) -> dict:
    # Insert your own model here
    model = MyModel(architecture_parameter)

    # Insert your training/evaluation pipeline here
    validation_error, training_error = train_and_eval(
        model, hyperparameter_a, hyperparameter_b
    )

    return {  # dict or float (the validation error)
        "loss": validation_error,
        "info_dict": {
            "training_error": training_error
            # + other metrics
        },
    }


# 2. Define a search space of the parameters of interest; ensure that the names
# are consistent with those defined in the run_pipeline function
pipeline_space = dict(
    hyperparameter_b=neps.IntegerParameter(
        lower=1, upper=42, is_fidelity=True
    ),  # Mark 'is_fidelity' as True for a multi-fidelity approach.
    hyperparameter_a=neps.FloatParameter(
        lower=0.001, upper=0.1, log=True
    ),  # If True, the search space is sampled in log space.
    architecture_parameter=neps.CategoricalParameter(
        ["option_a", "option_b", "option_c"]
    ),
)

if __name__ == "__main__":
    # 3. Run the NePS optimization
    logging.basicConfig(level=logging.INFO)
    neps.run(
        run_pipeline=run_pipeline,
        pipeline_space=pipeline_space,
        root_directory="path/to/save/results",  # Replace with the actual path.
        max_evaluations_total=100,
        searcher="hyperband",  # Optional: specifies the search strategy;
        # otherwise NePS decides based on your data.
    )
```
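One detail worth unpacking from the space above is `log=True` on `hyperparameter_a`: sampling in log space spreads trials evenly across orders of magnitude instead of clustering them near the upper bound. The self-contained sketch below shows the underlying transformation (this is the general technique, not NePS internals):

```python
import math
import random

lower, upper = 0.001, 0.1  # same bounds as hyperparameter_a above

def sample_log_uniform(low: float, high: float) -> float:
    # Sample uniformly between log(low) and log(high), then map back.
    return math.exp(random.uniform(math.log(low), math.log(high)))

samples = [sample_log_uniform(lower, upper) for _ in range(10_000)]

# In log space, 0.01 is the midpoint of [0.001, 0.1], so roughly half the
# samples land below it; a plain uniform sample would put ~90% above it.
below_midpoint = sum(s < 0.01 for s in samples)
```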
## Examples

Discover the features of NePS through these practical examples:

* **[Hyperparameter Optimization (HPO)](https://github.com/automl/neps/blob/master/neps_examples/template/basic_template.py)**: Learn the essentials of hyperparameter optimization with NePS.

* **[Architecture Search with Primitives](https://github.com/automl/neps/tree/master/neps_examples/basic_usage/architecture.py)**: Dive into architecture search using primitives in NePS.

* **[Multi-Fidelity Optimization](https://github.com/automl/neps/tree/master/neps_examples/efficiency/multi_fidelity.py)**: Understand how to leverage multi-fidelity optimization for efficient model tuning.

* **[Utilizing Expert Priors for Hyperparameters](https://github.com/automl/neps/blob/master/neps_examples/template/priorband_template.py)**: Learn how to incorporate expert priors for more efficient hyperparameter selection.

* **[Additional NePS Examples](https://github.com/automl/neps/tree/master/neps_examples/)**: Explore more examples, including various use cases and advanced configurations in NePS.
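The multi-fidelity example above builds on a general idea: evaluate many configurations cheaply at low fidelity (e.g., few epochs) and spend the full budget only on the most promising ones. As a hedged, self-contained sketch of that idea, here is successive halving, the building block of Hyperband, on a toy objective; this is not NePS's implementation, just the core mechanism:

```python
import random

def toy_loss(lr: float, budget: int) -> float:
    # Made-up objective: improves with budget, best near lr = 0.01.
    return (lr - 0.01) ** 2 + 1.0 / budget

def successive_halving(n_configs: int = 16, min_budget: int = 1, eta: int = 2):
    configs = [random.uniform(0.001, 0.1) for _ in range(n_configs)]
    budget = min_budget
    while len(configs) > 1:
        # Evaluate all surviving configs at the current (cheap) fidelity,
        # keep the best 1/eta of them, and raise the budget for survivors.
        ranked = sorted(configs, key=lambda lr: toy_loss(lr, budget))
        configs = ranked[: max(1, len(configs) // eta)]
        budget *= eta
    return configs[0], budget

random.seed(0)
best_lr, final_budget = successive_halving()
```

With 16 starting configurations and `eta=2`, only one configuration survives to the final budget of 16, so most of the compute goes to cheap, low-fidelity screening.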
# Installation

## Prerequisites

Ensure you have Python version 3.8, 3.9, 3.10, or 3.11 installed. NePS installation will automatically handle any additional dependencies via pip.

## Install from pip

```bash
pip install neural-pipeline-search
```

> Note: As indicated by the `v0.x.x` version number, NePS is early-stage code and APIs might change in the future.

## Install from source

!!! note
    We use [poetry](https://python-poetry.org/docs/) to manage dependencies.

```bash
git clone https://github.com/automl/neps.git
cd neps
poetry install --no-dev
```