Commit ece887a: update to master
danrgll committed Apr 11, 2024 · 2 parents 14a9189 + cf34bdc

Showing 37 changed files with 618 additions and 177 deletions.
2 changes: 1 addition & 1 deletion CITATION.cff
@@ -28,6 +28,6 @@ authors:
- family-names: Hutter
given-names: Frank
title: "Neural Pipeline Search (NePS)"
version: 0.11.0
version: 0.11.1
date-released: 2023-10-25
url: "https://github.com/automl/neps"
37 changes: 18 additions & 19 deletions README.md
@@ -25,13 +25,12 @@ In addition to the common features offered by traditional HPO and NAS libraries,
- NePS is equipped to handle context-free grammar search spaces, providing advanced capabilities for designing and optimizing architectures. This is leveraged by the insights presented in:
- [Construction of Hierarchical Neural Architecture Search Spaces based on Context-free Grammars](https://arxiv.org/abs/2211.01842)

3. [**Easy Parallelization:**](docs/parallelization.md)
- NePS simplifies the parallelization of optimization tasks. Whether experiments are running on a single machine or a distributed computing environment.
3. [**Easy Parallelization and Resumption of Runs:**](docs/parallelization.md)
- NePS simplifies the process of parallelizing optimization tasks both on individual computers and in distributed
computing environments. It also allows users to conveniently resume these optimization tasks after completion to
ensure a seamless and efficient workflow for long-running experiments.

4. [**Resume Runs After Termination:**](docs/parallelization.md)
- NePS allows users to easily resume optimization runs after termination, providing a convenient and efficient workflow for long-running experiments.

5. [**Seamless User Code Integration:**](neps_examples/template/)
4. [**Seamless User Code Integration:**](neps_examples/template/)
- NePS's modular design ensures flexibility and extensibility. Integrate NePS effortlessly into existing machine learning workflows.

## Getting Started
@@ -50,9 +49,10 @@ pip install neural-pipeline-search

Using `neps` always follows the same pattern:

1. Define a `run_pipeline` function that evaluates architectures/hyperparameters for your problem
1. Define a search space `pipeline_space` of architectures/hyperparameters
1. Call `neps.run` to optimize `run_pipeline` over `pipeline_space`
1. Define a `run_pipeline` function capable of evaluating different architectural and/or hyperparameter configurations
for your problem.
2. Define a search space named `pipeline_space` of those parameters, e.g. via a dictionary
3. Call `neps.run` to optimize `run_pipeline` over `pipeline_space`

In code, the usage pattern can look like this:

@@ -69,20 +69,20 @@ def run_pipeline(
model = MyModel(architecture_parameter)

# Train and evaluate the model with your training pipeline
validation_error, test_error = train_and_eval(
validation_error, training_error = train_and_eval(
model, hyperparameter_a, hyperparameter_b
)

return { # dict or float(validation error)
"loss": validation_error,
"info_dict": {
"test_error": test_error
"training_error": training_error
# + Other metrics
},
}


# 2. Define a search space of hyperparameters; use the same names as in run_pipeline
# 2. Define a search space of parameters; use the same names for the parameters as in run_pipeline
pipeline_space = dict(
hyperparameter_b=neps.IntegerParameter(
lower=1, upper=42, is_fidelity=True
@@ -111,20 +111,19 @@ if __name__ == "__main__":
## Examples

Discover how NePS works through these practical examples:
* **[Pipeline Space via YAML](neps_examples/basic_usage/defining_search_space)**: Explore how to define the `pipeline_space` using a
YAML file instead of a dictionary (see the sketch after this list).

* **Hyperparameter Optimization (HPO)**: Learn the essentials of hyperparameter optimization with NePS. [View Example](neps_examples/basic_usage/hyperparameters.py)

* **Defining Search Space with YAML**: Explore how to define the search space for your neural network models using a YAML file. [View Example](neps_examples/basic_usage/defining_search_space)
* **[Hyperparameter Optimization (HPO)](neps_examples/basic_usage/hyperparameters.py)**: Learn the essentials of hyperparameter optimization with NePS.

* **Architecture Search with Primitives**: Dive into architecture search using primitives in NePS. [View Example](neps_examples/basic_usage/architecture.py)
* **[Architecture Search with Primitives](neps_examples/basic_usage/architecture.py)**: Dive into architecture search using primitives in NePS.

* **Multi-Fidelity Optimization**: Understand how to leverage multi-fidelity optimization for efficient model tuning. [View Example](neps_examples/efficiency/multi_fidelity.py)
* **[Multi-Fidelity Optimization](neps_examples/efficiency/multi_fidelity.py)**: Understand how to leverage multi-fidelity optimization for efficient model tuning.

* **Utilizing Expert Priors for Hyperparameters**: Learn how to incorporate expert priors for more efficient hyperparameter selection. [View Example](neps_examples/efficiency/expert_priors_for_hyperparameters.py)
* **[Utilizing Expert Priors for Hyperparameters](neps_examples/efficiency/expert_priors_for_hyperparameters.py)**: Learn how to incorporate expert priors for more efficient hyperparameter selection.

* **[Additional NePS Examples](neps_examples/)**: Explore more examples, including various use cases and advanced configurations in NePS.
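
For the YAML-based definition mentioned above, a hypothetical sketch of such a file follows. The parameter names mirror the Python example earlier, but treat the exact keys as assumptions and consult the linked `defining_search_space` example for the authoritative schema.

```yaml
# pipeline_space.yaml -- illustrative sketch only
pipeline_space:
  hyperparameter_a:
    lower: 0.001
    upper: 0.1
    log: true          # sample on a log scale
  hyperparameter_b:
    lower: 1
    upper: 42
    is_fidelity: true  # treat as the fidelity dimension
  architecture_parameter:
    choices: ["option_a", "option_b", "option_c"]
```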


## Documentation

For more details and features, please have a look at our [documentation](https://automl.github.io/neps/latest/).
40 changes: 27 additions & 13 deletions docs/README.md
@@ -1,18 +1,32 @@
# Installation
# Neural Pipeline Search (NePS)

## Install from pip
[![PyPI version](https://img.shields.io/pypi/v/neural-pipeline-search?color=informational)](https://pypi.org/project/neural-pipeline-search/)
[![Python versions](https://img.shields.io/pypi/pyversions/neural-pipeline-search)](https://pypi.org/project/neural-pipeline-search/)
[![License](https://img.shields.io/pypi/l/neural-pipeline-search?color=informational)](LICENSE)
[![Tests](https://github.com/automl/neps/actions/workflows/tests.yaml/badge.svg)](https://github.com/automl/neps/actions)

```bash
pip install neural-pipeline-search
```
Welcome to NePS, a powerful and flexible Python library for hyperparameter optimization (HPO) and neural architecture search (NAS) whose primary goal is to enable HPO adoption in practice for deep learners!

## Install from source
NePS houses recently published and well-established algorithms that can all run massively parallel on any distributed setup, and ships tools to analyze runs, restart runs, etc.

!!! note
    We use [poetry](https://python-poetry.org/docs/) to manage dependencies.

```bash
git clone https://github.com/automl/neps.git
cd neps
poetry install --no-dev
```
## Key Features

In addition to the common features offered by traditional HPO and NAS libraries, NePS stands out with the following key features:

1. [**Hyperparameter Optimization (HPO) With Prior Knowledge:**](https://github.com/automl/neps/tree/master/neps_examples/template/priorband_template.py)
- NePS excels in efficiently tuning hyperparameters using algorithms that enable users to make use of their prior knowledge within the search space (see the sketch after this list). This is leveraged by the insights presented in:
- [PriorBand: Practical Hyperparameter Optimization in the Age of Deep Learning](https://arxiv.org/abs/2306.12370)
- [πBO: Augmenting Acquisition Functions with User Beliefs for Bayesian Optimization](https://arxiv.org/abs/2204.11051)

2. [**Neural Architecture Search (NAS) With Context-free Grammar Search Spaces:**](https://github.com/automl/neps/tree/master/neps_examples/basic_usage/architecture.py)
- NePS is equipped to handle context-free grammar search spaces, providing advanced capabilities for designing and optimizing architectures. This is leveraged by the insights presented in:
- [Construction of Hierarchical Neural Architecture Search Spaces based on Context-free Grammars](https://arxiv.org/abs/2211.01842)

3. [**Easy Parallelization and Resumption of Runs:**](https://automl.github.io/neps/latest/parallelization)
- NePS simplifies the process of parallelizing optimization tasks both on individual computers and in distributed
computing environments. It also allows users to conveniently resume these optimization tasks after completion to
ensure a seamless and efficient workflow for long-running experiments.

4. [**Seamless User Code Integration:**](https://github.com/automl/neps/tree/master/neps_examples/template/)
- NePS's modular design ensures flexibility and extensibility. Integrate NePS effortlessly into existing machine learning workflows.
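
To make the prior-knowledge feature above concrete, here is a hedged sketch of a search space that encodes an expert guess. The `default` and `default_confidence` arguments follow the expert-priors example; treat the exact names as assumptions:

```python
import neps

# Encodes the belief that a learning rate around 1e-2 is a good starting point.
pipeline_space = dict(
    learning_rate=neps.FloatParameter(
        lower=1e-5,
        upper=1e-1,
        log=True,
        default=1e-2,               # the expert's prior guess (assumed argument name)
        default_confidence="high",  # how strongly to trust the guess (assumed argument name)
    ),
)
```
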
36 changes: 29 additions & 7 deletions docs/analyse.md
@@ -17,6 +17,28 @@ ROOT_DIRECTORY
├── best_loss_trajectory.txt
└── best_loss_with_config_trajectory.txt
```
## Summary CSV

The argument `post_run_summary` in `neps.run` allows for the automatic generation of CSV files after a run is complete. The new root directory after utilizing this argument will look like the following:

```
ROOT_DIRECTORY
├── results
│   └── config_1
│       ├── config.yaml
│       ├── metadata.yaml
│       └── result.yaml
├── summary_csv
│   ├── config_data.csv
│   └── run_status.csv
├── all_losses_and_configs.txt
├── best_loss_trajectory.txt
└── best_loss_with_config_trajectory.txt
```

- *`config_data.csv`*: Contains all configuration details in CSV format, ordered by ascending `loss`. Details include configuration hyperparameters, any returned result from the `run_pipeline` function, and metadata information.

- *`run_status.csv`*: Provides general run details, such as the number of sampled configs, best configs, number of failed configs, best loss, etc.
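
A minimal sketch of enabling and inspecting the summary, assuming `run_pipeline` and `pipeline_space` are defined as in the usage example (reading the CSV with pandas is an illustrative choice, not a NePS requirement):

```python
import neps
import pandas as pd

neps.run(
    run_pipeline=run_pipeline,
    pipeline_space=pipeline_space,
    root_directory="results_root",
    max_evaluations_total=25,
    post_run_summary=True,  # write summary_csv/ once the run is complete
)

# config_data.csv is ordered by ascending loss, so the first row
# holds the best configuration found.
df = pd.read_csv("results_root/summary_csv/config_data.csv")
print(df.head())
```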

## TensorBoard Integration

@@ -37,19 +59,19 @@ The `tblogger.log` function is invoked within the model's training loop to facil
tblogger.log(
loss: float,
current_epoch: int,
write_summary_incumbent: bool = False,
write_config_scalar: bool = False,
write_config_hparam: bool = True,
write_summary_incumbent: bool = False,
extra_data: dict | None = None
)
```

- **Parameters:**
- `loss` (float): The loss value to be logged.
- `current_epoch` (int): The current epoch or iteration number.
- `write_summary_incumbent` (bool, optional): Set to `True` for a live incumbent trajectory.
- `write_config_scalar` (bool, optional): Set to `True` for a live loss trajectory for each configuration.
- `write_config_hparam` (bool, optional): Set to `True` for live parallel coordinate, scatter plot matrix, and table view.
- `write_summary_incumbent` (bool, optional): Set to `True` for a live incumbent trajectory.
- `extra_data` (dict, optional): Additional data to be logged, provided as a dictionary.
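
A minimal usage sketch inside a training loop is shown below. The import path follows the tutorial example, while `train_epoch`, `validate`, `model`, `optimizer`, and `max_epochs` are hypothetical stand-ins for your own training code; `extra_data` is covered under Extra Custom Logging below.

```python
from neps.plot.tensorboard_eval import tblogger

for epoch in range(max_epochs):
    train_epoch(model, optimizer)  # hypothetical training step
    val_loss = validate(model)     # hypothetical validation step
    tblogger.log(
        loss=val_loss,
        current_epoch=epoch,
        write_config_scalar=True,  # live loss trajectory for this configuration
        write_config_hparam=True,  # parallel-coordinate, scatter-matrix, and table views
    )
```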

### Extra Custom Logging
@@ -104,15 +126,15 @@ You can find this example [here](https://github.com/automl/neps/blob/master/neps
!!! info "Important"
We have optimized the example for computational efficiency. If you wish to replicate the exact results showcased in the following section, we recommend the following modifications:

1- Increase maximum epochs [here](https://github.com/automl/neps/blob/master/neps_examples/convenience/neps_tblogger_tutorial.py#L260) from 2 to 10
1- Increase maximum epochs from 2 to 10

2- Set the `write_summary_incumbent` argument [here](https://github.com/automl/neps/blob/master/neps_examples/convenience/neps_tblogger_tutorial.py#L300) to `True`
2- Set the `write_summary_incumbent` argument to `True`

3- Change the searcher [here](https://github.com/automl/neps/blob/master/neps_examples/convenience/neps_tblogger_tutorial.py#L357) from `random_search` to `bayesian_optimization`
3- Change the searcher from `random_search` to `bayesian_optimization`

4- Increase the maximum evaluations [here](https://github.com/automl/neps/blob/master/neps_examples/convenience/neps_tblogger_tutorial.py#L362) from 2 to 14
4- Increase the maximum evaluations before disabling `tblogger` from 2 to 14

5- Increase the maximum evaluations [here](https://github.com/automl/neps/blob/master/neps_examples/convenience/neps_tblogger_tutorial.py#L391) from 3 to 15
5- Increase the maximum evaluations after disabling `tblogger` from 3 to 15

### Visualization Results

22 changes: 12 additions & 10 deletions docs/citations.md
@@ -1,31 +1,33 @@
# Citation of The Software
# Citations

## Citation of The Software

For citing NePS, please refer to the following:

## APA Style
### APA Style

```apa
Stoll, D., Mallik, N., Schrodi, S., Janowski, M., Garibov, S., Abou Chakra, T., Hvarfner, C., Bergman, E., Binxin, R., Kober, N., Vallaeys, T., & Hutter, F. (2023). Neural Pipeline Search (NePS) (Version 0.10.0) [Computer software]. https://github.com/automl/neps
Stoll, D., Mallik, N., Schrodi, S., Janowski, M., Garibov, S., Abou Chakra, T., Rogalla, D., Bergman, E., Hvarfner, C., Binxin, R., Kober, N., Vallaeys, T., & Hutter, F. (2023). Neural Pipeline Search (NePS) (Version 0.11.0) [Computer software]. https://github.com/automl/neps
```

## BibTex Style
### BibTex Style

```bibtex
@software{Stoll_Neural_Pipeline_Search_2023,
author = {Stoll, Danny and Mallik, Neeratyoy and Schrodi, Simon and Janowski, Maciej and Garibov, Samir and Abou Chakra, Tarek and Hvarfner, Carl and Bergman, Eddie and Binxin, Ru and Kober, Nils and Vallaeys, Théophane and Hutter, Frank},
author = {Stoll, Danny and Mallik, Neeratyoy and Schrodi, Simon and Janowski, Maciej and Garibov, Samir and Abou Chakra, Tarek and Rogalla, Daniel and Bergman, Eddie and Hvarfner, Carl and Binxin, Ru and Kober, Nils and Vallaeys, Théophane and Hutter, Frank},
month = oct,
title = {{Neural Pipeline Search (NePS)}},
url = {https://github.com/automl/neps},
version = {0.10.0},
version = {0.11.0},
year = {2023}
}
```

# Citation of Papers
## Citation of Papers

If you have used [PriorBand](https://openreview.net/forum?id=uoiwugtpCH) as the optimizer, please use the bibtex below:
### PriorBand

## PriorBand
If you have used [PriorBand](https://openreview.net/forum?id=uoiwugtpCH) as the optimizer, please use the bibtex below:

```bibtex
@inproceedings{mallik2023priorband,
@@ -37,7 +39,7 @@ keywords = {}
}
```

## Hierarchical NAS with Context-free Grammars
### Hierarchical NAS with Context-free Grammars

If you have used the context-free grammar search space and the graph kernels implemented in NePS for the paper [Hierarchical NAS](https://openreview.net/forum?id=Hpt1i5j6wh), please use the bibtex below:

Binary file modified docs/doc_images/tensorboard/tblogger_hparam1.jpg
Binary file modified docs/doc_images/tensorboard/tblogger_hparam2.jpg
Binary file modified docs/doc_images/tensorboard/tblogger_hparam3.jpg
Binary file modified docs/doc_images/tensorboard/tblogger_image.jpg
Binary file modified docs/doc_images/tensorboard/tblogger_scalar.jpg
101 changes: 101 additions & 0 deletions docs/getting_started.md
@@ -0,0 +1,101 @@
# Getting Started

Getting started with NePS involves a straightforward yet powerful process, centering around its three main components.
This approach ensures flexibility and efficiency in evaluating different architecture and hyperparameter configurations
for your problem.

## The 3 Main Components
1. **Define a [`run_pipeline`](https://automl.github.io/neps/latest/run_pipeline) Function**: This function is essential
for evaluating different configurations. You'll implement the specific logic for your problem within this function.
For detailed instructions on initializing and effectively using `run_pipeline`, refer to the guide.

2. **Establish a [`pipeline_space`](https://automl.github.io/neps/latest/pipeline_space)**: This is your search space,
defining the parameters to optimize. You can structure it in various formats, including dictionaries, YAML, or ConfigSpace.
The guide offers insights into defining and configuring your search space.

3. **Execute with [`neps.run`](https://automl.github.io/neps/latest/neps_run)**: Optimize your `run_pipeline` over
the `pipeline_space` using this function. For a thorough overview of the arguments and their explanations,
check out the detailed documentation.

By following these steps and utilizing the extensive resources provided in the guides, you can tailor NePS to meet
your specific requirements, ensuring a streamlined and effective optimization process.

## Basic Usage
In code, the usage pattern can look like this:

```python
import neps
import logging


# 1. Define a function that accepts hyperparameters and computes the validation error
def run_pipeline(
hyperparameter_a: float, hyperparameter_b: int, architecture_parameter: str
) -> dict:
# insert here your own model
model = MyModel(architecture_parameter)

# insert here your training/evaluation pipeline
validation_error, training_error = train_and_eval(
model, hyperparameter_a, hyperparameter_b
)

return { # dict or float(validation error)
"loss": validation_error,
"info_dict": {
"training_error": training_error
# + Other metrics
},
}


# 2. Define a search space of the parameters of interest; ensure that the names are consistent with those defined
# in the run_pipeline function
pipeline_space = dict(
hyperparameter_b=neps.IntegerParameter(
lower=1, upper=42, is_fidelity=True
), # Mark 'is_fidelity' as true for a multi-fidelity approach.
    hyperparameter_a=neps.FloatParameter(
        lower=0.001, upper=0.1, log=True
    ),  # If log=True, this parameter is sampled on a log scale.
architecture_parameter=neps.CategoricalParameter(
["option_a", "option_b", "option_c"]
),
)

if __name__ == "__main__":
# 3. Run the NePS optimization
logging.basicConfig(level=logging.INFO)
neps.run(
run_pipeline=run_pipeline,
pipeline_space=pipeline_space,
root_directory="path/to/save/results", # Replace with the actual path.
max_evaluations_total=100,
searcher="hyperband" # Optional specifies the search strategy,
# otherwise NePs decides based on your data.
)
```

## Examples

Discover the features of NePS through these practical examples:

* **[Hyperparameter Optimization (HPO)](
https://github.com/automl/neps/blob/master/neps_examples/template/basic_template.py)**: Learn the essentials of
hyperparameter optimization with NePS.

* **[Architecture Search with Primitives](
https://github.com/automl/neps/tree/master/neps_examples/basic_usage/architecture.py)**: Dive into architecture search
using primitives in NePS.

* **[Multi-Fidelity Optimization](
https://github.com/automl/neps/tree/master/neps_examples/efficiency/multi_fidelity.py)**: Understand how to leverage
multi-fidelity optimization for efficient model tuning.

* **[Utilizing Expert Priors for Hyperparameters](
https://github.com/automl/neps/blob/master/neps_examples/template/priorband_template.py)**:
Learn how to incorporate expert priors for more efficient hyperparameter selection.

* **[Additional NePS Examples](
https://github.com/automl/neps/tree/master/neps_examples/)**: Explore more examples, including various use cases and
advanced configurations in NePS.
24 changes: 24 additions & 0 deletions docs/installation.md
@@ -0,0 +1,24 @@
# Installation

## Prerequisites

Ensure you have Python version 3.8, 3.9, 3.10, or 3.11 installed. NePS installation will automatically handle
any additional dependencies via pip.

## Install from pip

```bash
pip install neural-pipeline-search
```
> Note: As indicated by the `v0.x.x` version number, NePS is early-stage code and its APIs might change in the future.

## Install from source

!!! note
    We use [poetry](https://python-poetry.org/docs/) to manage dependencies.

```bash
git clone https://github.com/automl/neps.git
cd neps
poetry install --no-dev
```
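
A quick sanity check that the installation succeeded (this one-liner is an illustration, not part of the official docs):

```bash
python -c "import neps; print('NePS imported successfully')"
```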