From 52425f6e477411a28c44238127a0c03ea5c5e078 Mon Sep 17 00:00:00 2001
From: =
Date: Wed, 3 Jul 2024 10:26:13 +0200
Subject: [PATCH] Improve README.md and update docs landing page

---
 README.md                |  42 +++++---------
 docs/dev_docs/roadmap.md |  24 +-------
 docs/index.md            | 120 +++++++++++++++------------------
 3 files changed, 64 insertions(+), 122 deletions(-)

diff --git a/README.md b/README.md
index 9153d1e8..06d8fb8d 100644
--- a/README.md
+++ b/README.md
@@ -5,9 +5,9 @@
 [![License](https://img.shields.io/pypi/l/neural-pipeline-search?color=informational)](LICENSE)
 [![Tests](https://github.com/automl/neps/actions/workflows/tests.yaml/badge.svg)](https://github.com/automl/neps/actions)
 
-Welcome to NePS, a powerful and flexible Python library for hyperparameter optimization (HPO) and neural architecture search (NAS) with its primary goal: enable HPO and NAS for deep learners!
+Welcome to NePS, a powerful and flexible Python library for hyperparameter optimization (HPO) and neural architecture search (NAS) with its primary goal: **make HPO and NAS usable for deep learners in practice**.
 
-NePS houses recently published and also well-established algorithms that can all be run massively parallel on distributed setups, with tools to analyze runs, restart runs, etc., all tailored to the needs of deep learning experts.
+NePS houses recently published and also well-established algorithms that can all be run massively parallel on distributed setups, with tools to analyze runs, restart runs, etc., all **tailored to the needs of deep learning experts**.
 
 Take a look at our [documentation](https://automl.github.io/neps/latest/) for all the details on how to use NePS!
 
@@ -15,26 +15,22 @@ Take a look at our [documentation](https://automl.github.io/neps/latest/) for al
 
 In addition to the features offered by traditional HPO and NAS libraries, NePS, e.g., stands out with:
 
-1. [**Hyperparameter Optimization (HPO) With Prior Knowledge:**](neps_examples/template/priorband_template.py)
+1. [**Hyperparameter Optimization (HPO) with Prior Knowledge and Cheap Proxies:**](neps_examples/template/priorband_template.py)
 
    - NePS excels in efficiently tuning hyperparameters using algorithms that enable users to make use of their prior knowledge within the search space. This is leveraged by the insights presented in:
     - [PriorBand: Practical Hyperparameter Optimization in the Age of Deep Learning](https://arxiv.org/abs/2306.12370)
     - [πBO: Augmenting Acquisition Functions with User Beliefs for Bayesian Optimization](https://arxiv.org/abs/2204.11051)
 
-1. [**Neural Architecture Search (NAS) With Context-free Grammar Search Spaces:**](neps_examples/basic_usage/architecture.py)
+1. [**Neural Architecture Search (NAS) with General Search Spaces:**](neps_examples/basic_usage/architecture.py)
 
   - NePS is equipped to handle context-free grammar search spaces, providing advanced capabilities for designing and optimizing architectures. this is leveraged by the insights presented in:
     - [Construction of Hierarchical Neural Architecture Search Spaces based on Context-free Grammars](https://arxiv.org/abs/2211.01842)
 
-1. [**Easy Parallelization and Resumption of Runs:**](https://automl.github.io/neps/latest/examples/efficiency/)
+1. [**Easy Parallelization and Tailored to DL:**](https://automl.github.io/neps/latest/examples/efficiency/)
 
   - NePS simplifies the process of parallelizing optimization tasks both on individual computers and in distributed
-    computing environments.
It also allows users to conveniently resume these optimization tasks after completion to
-    ensure a seamless and efficient workflow for long-running experiments.
-
-1. [**Seamless User Code Integration:**](neps_examples/template/)
-
-   - NePS's modular design ensures flexibility and extensibility. Integrate NePS effortlessly into existing machine learning workflows.
+    computing environments. As NePS is made for deep learners, all technical choices are made with DL in mind and common
+    DL tools such as TensorBoard are [embraced](https://automl.github.io/neps/latest/reference/analyse/#visualizing-results).
 
 ## Installation
 
@@ -44,15 +40,12 @@ To install the latest release from PyPI run
 pip install neural-pipeline-search
 ```
 
-To get the latest version from github run
+To get the latest version from GitHub run
 
 ```bash
 pip install git+https://github.com/automl/neps.git
 ```
 
-> Note: As indicated with the `v0.x.x` version number APIs will change in the future.
-
-
 ## Basic Usage
 
 Using `neps` always follows the same pattern:
 
@@ -77,17 +70,10 @@ def run_pipeline(
     model = MyModel(architecture_parameter)
 
     # Train and evaluate the model with your training pipeline
-    validation_error, training_error = train_and_eval(
+    validation_error = train_and_eval(
         model, hyperparameter_a, hyperparameter_b
     )
-
-    return { # dict or float(validation error)
-        "loss": validation_error,
-        "info_dict": {
-            "training_error": training_error
-            # + Other metrics
-        },
-    }
+    return validation_error
 
 
 # 2. Define a search space of parameters; use the same parameter names as in run_pipeline
@@ -112,16 +98,16 @@ neps.run(
 
 ## Examples
 
-Discover how NePS works through these practical examples:
-
-- **[Hyperparameter Optimization (HPO)](neps_examples/basic_usage/hyperparameters.py)**: Learn the essentials of hyperparameter optimization with NePS.
+Discover how NePS works through these examples:
 
-- **[Architecture Search with Primitives](neps_examples/basic_usage/architecture.py)**: Dive into architecture search using primitives in NePS.
+- **[Hyperparameter Optimization](neps_examples/basic_usage/hyperparameters.py)**: Learn the essentials of hyperparameter optimization with NePS.
 
 - **[Multi-Fidelity Optimization](neps_examples/efficiency/multi_fidelity.py)**: Understand how to leverage multi-fidelity optimization for efficient model tuning.
 
 - **[Utilizing Expert Priors for Hyperparameters](neps_examples/efficiency/expert_priors_for_hyperparameters.py)**: Learn how to incorporate expert priors for more efficient hyperparameter selection.
 
+- **[Architecture Search](neps_examples/basic_usage/architecture.py)**: Dive into (hierarchical) architecture search in NePS.
+
 - **[Additional NePS Examples](neps_examples/)**: Explore more examples, including various use cases and advanced configurations in NePS.
 
 ## Contributing
diff --git a/docs/dev_docs/roadmap.md b/docs/dev_docs/roadmap.md
index da69b350..c8bf9510 100644
--- a/docs/dev_docs/roadmap.md
+++ b/docs/dev_docs/roadmap.md
@@ -7,6 +7,7 @@
 - Improve handling of multi-fidelity for large scale (slurm script modification)
 - Evaluate and maybe improve ease-of-use of NePS and DDP etc.
 - Optimize dependencies
+- Improved examples
 
 ### Fixes
 
@@ -24,12 +25,6 @@
 ### Documentation
 
 - Keep citations doc up to date
-- Role of analysing runs needs to be higher in docs
-- Explain what optimizers are run per default / papers higher in docs
-- Rework README.md
-  - Rethink key features. Who is reading this? Mention multi-fidelity / scaling algorithmis?
-  - Code example of readme should work when copied
-  - Keep README synced with docs landingpage more nicely
 
 ### Tests
 
@@ -40,7 +35,7 @@
 ### Features
 
-- Generate plot after each evaluation
+- Generate pdf plot after each evaluation
 - Finegrained control over user prior
 - Print search space upon run
 - Utility to generate code for best architecture
 
@@ -55,16 +50,13 @@
 - Improve neps.optimizers:
   - Maintained vs unmaintained optimizers
   - Remove unnecessary / broken optimizers
+  - Merge GP and hierarchical GP
 - Break up search space and config aspect
 
 ### Documentation
 
 - NAS documentation
 
-### Tests
-
-- Regression tests to run on each push
-
 ## After 1.0.0
 
 ### Features
 
@@ -75,13 +67,3 @@
 ### Documentation
 
 - Keep a changelog
-
-
-## Rethink
-
-- Log priors include again
-- Allow yaml based input of search space and the target function source to `neps.run`
-- Support conditionals in ConfigSpace search space
-- Support logging of optimizer state details
-- Merge GP and hierarchical GP
-- Generate analysis pdf
diff --git a/docs/index.md b/docs/index.md
index 69ba0ce2..4dc988c8 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -5,30 +5,30 @@
 [![License](https://img.shields.io/pypi/l/neural-pipeline-search?color=informational)](LICENSE)
 [![Tests](https://github.com/automl/neps/actions/workflows/tests.yaml/badge.svg)](https://github.com/automl/neps/actions)
 
-Welcome to NePS, a powerful and flexible Python library for hyperparameter optimization (HPO) and neural architecture search (NAS) with its primary goal: enable HPO adoption in practice for deep learners!
+Welcome to NePS, a powerful and flexible Python library for hyperparameter optimization (HPO) and neural architecture search (NAS) with its primary goal: **make HPO and NAS usable for deep learners in practice**.
 
-NePS houses recently published and some more well-established algorithms that are all capable of being run massively parallel on any distributed setup, with tools to analyze runs, restart runs, etc.
+NePS houses recently published and also well-established algorithms that can all be run massively parallel on distributed setups, with tools to analyze runs, restart runs, etc., all **tailored to the needs of deep learning experts**.
 
 ## Key Features
 
-In addition to the common features offered by traditional HPO and NAS libraries, NePS stands out with the following key features:
+In addition to the features offered by traditional HPO and NAS libraries, NePS, e.g., stands out with:
 
-1. [**Hyperparameter Optimization (HPO) With Prior Knowledge:**](./examples/template/priorband_template.md)
-   - NePS excels in efficiently tuning hyperparameters using algorithms that enable users to make use of their prior knowledge within the search space. This is leveraged by the insights presented in:
-   - [PriorBand: Practical Hyperparameter Optimization in the Age of Deep Learning](https://arxiv.org/abs/2306.12370)
-   - [πBO: Augmenting Acquisition Functions with User Beliefs for Bayesian Optimization](https://arxiv.org/abs/2204.11051)
+1. [**Hyperparameter Optimization (HPO) with Prior Knowledge and Cheap Proxies:**](./examples/template/priorband_template.py)
 
-2. [**Neural Architecture Search (NAS) With Context-free Grammar Search Spaces:**](./examples/basic_usage/architecture.md)
-   - NePS is equipped to handle context-free grammar search spaces, providing advanced capabilities for designing and optimizing architectures.
this is leveraged by the insights presented in:
-   - [Construction of Hierarchical Neural Architecture Search Spaces based on Context-free Grammars](https://arxiv.org/abs/2211.01842)
+   - NePS excels in efficiently tuning hyperparameters using algorithms that enable users to make use of their prior knowledge within the search space. This is leveraged by the insights presented in:
+     - [PriorBand: Practical Hyperparameter Optimization in the Age of Deep Learning](https://arxiv.org/abs/2306.12370)
+     - [πBO: Augmenting Acquisition Functions with User Beliefs for Bayesian Optimization](https://arxiv.org/abs/2204.11051)
 
-3. **Easy Parallelization and Resumption of Runs:**
-   - NePS simplifies the process of parallelizing optimization tasks both on individual computers and in distributed
-     computing environments. It also allows users to conveniently resume these optimization tasks after completion to
-     ensure a seamless and efficient workflow for long-running experiments.
+1. [**Neural Architecture Search (NAS) with General Search Spaces:**](./examples/basic_usage/architecture.py)
 
-4. [**Seamless User Code Integration:**](./examples/index.md)
-   - NePS's modular design ensures flexibility and extensibility. Integrate NePS effortlessly into existing machine learning workflows.
+   - NePS is equipped to handle context-free grammar search spaces, providing advanced capabilities for designing and optimizing architectures. This is leveraged by the insights presented in:
+     - [Construction of Hierarchical Neural Architecture Search Spaces based on Context-free Grammars](https://arxiv.org/abs/2211.01842)
+
+1. [**Easy Parallelization and Tailored to DL:**](https://automl.github.io/neps/latest/examples/efficiency/)
+
+   - NePS simplifies the process of parallelizing optimization tasks both on individual computers and in distributed
+     computing environments. As NePS is made for deep learners, all technical choices are made with DL in mind and common
+     DL tools such as TensorBoard are [embraced](https://automl.github.io/neps/latest/reference/analyse/#visualizing-results).
 
 !!! tip
 
@@ -38,33 +38,28 @@ In addition to the common features offered by traditional HPO and NAS libraries,
     * [Reference documentation](./reference/neps_run.md) for a quick overview.
     * [API](./api/neps/api.md) for a more detailed reference.
     * [Examples](./examples/template/basic_template.md) for copy-pastable code to get started.
 
-## Getting Started
+## Installation
 
-### 1. Installation
-NePS requires Python 3.8 or higher. You can install it via pip or from source.
+To install the latest release from PyPI run
 
-Using pip:
 ```bash
 pip install neural-pipeline-search
 ```
 
-> Note: As indicated with the `v0.x.x` version number, NePS is early stage code and APIs might change in the future.
+To get the latest version from GitHub run
 
-You can install from source by cloning the repository and running:
 ```bash
-git clone git@github.com:automl/neps.git
-cd neps
-poetry install
+pip install git+https://github.com/automl/neps.git
 ```
 
-### 2. Basic Usage
+## Basic Usage
 
 Using `neps` always follows the same pattern:
 
 1. Define a `run_pipeline` function capable of evaluating different architectural and/or hyperparameter configurations for your problem.
-2. Define a search space named `pipeline_space` of those Parameters e.g. via a dictionary
-3. Call `neps.run` to optimize `run_pipeline` over `pipeline_space`
+1. Define a search space named `pipeline_space` of those Parameters e.g. via a dictionary
+1.
Call `neps.run` to optimize `run_pipeline` over `pipeline_space`
 
@@ -81,71 +76,50 @@ def run_pipeline(
     model = MyModel(architecture_parameter)
 
     # Train and evaluate the model with your training pipeline
-    validation_error, training_error = train_and_eval(
+    validation_error = train_and_eval(
         model, hyperparameter_a, hyperparameter_b
     )
+    return validation_error
 
-    return { # dict or float(validation error)
-        "loss": validation_error,
-        "info_dict": {
-            "training_error": training_error
-            # + Other metrics
-        },
-    }
-
-# 2. Define a search space of parameters; use the same names for the parameters as in run_pipeline
+# 2. Define a search space of parameters; use the same parameter names as in run_pipeline
 pipeline_space = dict(
-    hyperparameter_b=neps.IntegerParameter(
-        lower=1, upper=42, is_fidelity=True
-    ),  # Mark 'is_fidelity' as true for a multi-fidelity approach.
     hyperparameter_a=neps.FloatParameter(
-        lower=0.001, upper=0.1, log=True
-    ),  # If True, the search space is sampled in log space.
-    architecture_parameter=neps.CategoricalParameter(
-        ["option_a", "option_b", "option_c"]
+        lower=0.001, upper=0.1, log=True  # The search space is sampled in log space
     ),
+    hyperparameter_b=neps.IntegerParameter(lower=1, upper=42),
+    architecture_parameter=neps.CategoricalParameter(["option_a", "option_b"]),
 )
 
-if __name__ == "__main__":
-    # 3. Run the NePS optimization
-    logging.basicConfig(level=logging.INFO)
-    neps.run(
-        run_pipeline=run_pipeline,
-        pipeline_space=pipeline_space,
-        root_directory="path/to/save/results",  # Replace with the actual path.
-        max_evaluations_total=100,
-        searcher="hyperband"  # Optional specifies the search strategy,
-        # otherwise NePs decides based on your data.
-    )
-```
+# 3. Run the NePS optimization
+logging.basicConfig(level=logging.INFO)
+neps.run(
+    run_pipeline=run_pipeline,
+    pipeline_space=pipeline_space,
+    root_directory="path/to/save/results",  # Replace with the actual path.
+    max_evaluations_total=100,
+)
+```
 
 ## Examples
 
-Discover how NePS works through these practical examples:
+Discover how NePS works through these examples:
 
-* **[Pipeline Space via YAML](./examples/basic_usage/hpo_usage_example.md)**:
-  Explore how to define the `pipeline_space` using a YAML file instead of a dictionary.
+- **[Hyperparameter Optimization](./examples/basic_usage/hyperparameters.py)**: Learn the essentials of hyperparameter optimization with NePS.
 
-* **[Hyperparameter Optimization (HPO)](./examples/basic_usage/hyperparameters.md)**:
-  Learn the essentials of hyperparameter optimization with NePS.
+- **[Multi-Fidelity Optimization](./examples/efficiency/multi_fidelity.py)**: Understand how to leverage multi-fidelity optimization for efficient model tuning.
 
-* **[Architecture Search with Primitives](./examples/basic_usage/architecture.md)**:
-  Dive into architecture search using primitives in NePS.
+- **[Utilizing Expert Priors for Hyperparameters](./examples/efficiency/expert_priors_for_hyperparameters.py)**: Learn how to incorporate expert priors for more efficient hyperparameter selection.
 
-* **[Multi-Fidelity Optimization](./examples/efficiency/multi_fidelity.md)**:
-  Understand how to leverage multi-fidelity optimization for efficient model tuning.
+- **[Architecture Search](./examples/basic_usage/architecture.py)**: Dive into (hierarchical) architecture search in NePS.
- **[Additional NePS Examples](./examples/)**: Explore more examples, including various use cases and advanced configurations in NePS.
 
+## Contributing
+
+Please see the [documentation for contributors](./dev_docs/contributing/).
 
 ## Citations
 
-Please consider citing us if you use our tool!
-
-Refer to our [documentation on citations](./citations.md)
+For pointers on citing the NePS package and its papers, refer to our [documentation on citations](./citations.md).
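The prior-knowledge (PriorBand/πBO) and multi-fidelity features highlighted in the changed text can be combined in a single `pipeline_space`. Below is a minimal sketch in the spirit of the README snippet above: the `default` and `default_confidence` arguments are assumed to be how this NePS version encodes a prior over a parameter, `is_fidelity=True` is the flag shown in the removed landing-page example, and `train_and_eval` is a stand-in for the user's own training code.

```python
import logging

import neps


def train_and_eval(learning_rate: float, num_epochs: int) -> float:
    # Stand-in for the user's real training loop; returns a validation error.
    return (learning_rate - 1e-3) ** 2 + 1.0 / num_epochs


# 1. A pipeline that trains for `num_epochs` (the fidelity) and returns the validation error
def run_pipeline(learning_rate: float, num_epochs: int) -> float:
    validation_error = train_and_eval(learning_rate, num_epochs)
    return validation_error


# 2. A search space with a prior on the learning rate and an epoch fidelity
pipeline_space = dict(
    learning_rate=neps.FloatParameter(
        lower=1e-5,
        upper=1e-1,
        log=True,
        default=1e-3,               # assumed API: prior belief about a good value
        default_confidence="high",  # assumed API: how strongly to trust the prior
    ),
    num_epochs=neps.IntegerParameter(lower=1, upper=50, is_fidelity=True),
)

# 3. Run the optimization; without an explicit `searcher`, NePS picks a strategy for the space
logging.basicConfig(level=logging.INFO)
neps.run(
    run_pipeline=run_pipeline,
    pipeline_space=pipeline_space,
    root_directory="results/prior_and_fidelity_sketch",  # hypothetical output directory
    max_evaluations_total=50,
)
```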