Version 1.3.2
* Added stale bot support.
* If `get_distribution` reports package version 0.0.0, the module's own version is used instead.
* Removed `tox.ini`.
* Moved `requirements.txt` to `setup.py`.
* Added multi-objective support for ROAR.
* Added notes in documentation that `SMAC4MF` is the closest implementation to BOHB/HpBandSter.
renesass authored May 5, 2022
1 parent 9d8f511 commit 2935561
Showing 18 changed files with 144 additions and 74 deletions.
22 changes: 22 additions & 0 deletions .github/stale.yml
@@ -0,0 +1,22 @@
# Number of days of inactivity before an issue becomes stale
daysUntilStale: 60

# Number of days of inactivity before a stale issue is closed
daysUntilClose: 7

# Issues with these labels will never be considered stale
exemptLabels:
- pinned
- security

# Label to use when marking an issue as stale
staleLabel: wontfix

# Comment to post when marking an issue as stale. Set to `false` to disable
markComment: >
This issue has been automatically marked as stale because it has not had
recent activity. It will be closed if no further activity occurs. Thank you
for your contributions.
# Comment to post when closing a stale issue. Set to `false` to disable
closeComment: false
File renamed without changes.
11 changes: 10 additions & 1 deletion changelog.md
@@ -1,5 +1,14 @@
# 1.3.1
# 1.3.2
* Added stale bot support.
* If `get_distribution` reports package version 0.0.0, the module's own version is used instead.
* Removed `tox.ini`.
* Moved `requirements.txt` to `setup.py`.
* Added multi-objective support for ROAR.
* Added notes in documentation that `SMAC4MF` is the closest implementation to BOHB/HpBandSter.


# 1.3.1
* Added Python 3.7 support again.


11 changes: 7 additions & 4 deletions docs/details/callbacks.rst
@@ -8,13 +8,16 @@ implemented callbacks is very limited, but they can easily be added.
How to add a new callback
^^^^^^^^^^^^^^^^^^^^^^^^^

#. Implement a callback class in ``smac/callbacks.py``. There are no restrictions on how such a
* Implement a callback class in ``smac/callbacks.py``. There are no restrictions on how such a
callback must look like, but it is recommended to implement the main logic inside the `__call__`
function, such as for example in ``IncorporateRunResultCallback``.
#. Add your callback to ``smac.smbo.optimizer.SMBO._callbacks``, using the name of your callback

* Add your callback to ``smac.smbo.optimizer.SMBO._callbacks``, using the name of your callback
as the key, and an empty list as the value.
#. Add your callback to ``smac.smbo.optimizer.SMBO._callback_to_key``, using the callback class as

* Add your callback to ``smac.smbo.optimizer.SMBO._callback_to_key``, using the callback class as
the key, and the name as value (the name used in 2.).
#. Implement calling all registered callbacks at the correct place. This is as simple as

* Implement calling all registered callbacks at the correct place. This is as simple as
``for callback in self._callbacks['your_callback']: callback(*args, **kwargs)``, where you
obviously need to change the callback name and signature.
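
Taken together, a minimal sketch of such a callback might look as follows (the class name and the recorded attribute are illustrative; only the ``__call__``-based interface and ``IncorporateRunResultCallback`` come from the steps above):

```python
# Hypothetical example for smac/callbacks.py -- names are illustrative only.
class RunResultLoggerCallback:
    """Collects the cost of every run result it is called with (sketch)."""

    def __init__(self):
        self.costs = []

    def __call__(self, smbo, run_info, result, time_left):
        # Signature modeled on IncorporateRunResultCallback; verify against
        # smac/callbacks.py before relying on it. Returning None lets SMAC continue.
        self.costs.append(result.cost)
```

It would then be registered under its own key in ``SMBO._callbacks`` and ``SMBO._callback_to_key`` and triggered with the ``for callback in self._callbacks[...]`` loop from the last step.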
6 changes: 6 additions & 0 deletions docs/details/facades.rst
@@ -22,6 +22,12 @@ These recommendations are based on our experience and technical limitations and
Forest<RF>`", ":term:`Random Forest<RF>`, :term:`Gaussian Process<GP>`, :term:`GP-MCMC` or Random"


.. note::

The ``SMAC4MF`` facade is the closest implementation to
`BOHB <https://github.com/automl/HpBandSter>`_.


Inheritance
~~~~~~~~~~~

7 changes: 1 addition & 6 deletions docs/details/multi_objective.rst
@@ -20,12 +20,7 @@ The basic recipe is as follows:
Please set ``run_obj = 'quality'``.
#. Now you can optionally pass a custom multi-objective algorithm class or further kwargs to the SMAC
facade (via ``multi_objective_algorithm`` and/or ``multi_objective_kwargs``).
Per default, ParEgo is used as the multi-objective algorithm.


.. warning::

Multi-Objective Optimization does currently *not* support Intensifications like Hyperband or Successive Halving.
Per default, a mean aggregation strategy is used as the multi-objective algorithm.
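
As a hedged sketch of this recipe (the scenario keys, the ``ParEGO`` import and its ``rho`` value mirror the test changes later in this commit; the toy objective function and hyperparameter are made up):

```python
import numpy as np
from ConfigSpace import ConfigurationSpace
from ConfigSpace.hyperparameters import UniformFloatHyperparameter

from smac.facade.smac_hpo_facade import SMAC4HPO
from smac.optimizer.multi_objective.parego import ParEGO
from smac.scenario.scenario import Scenario

cs = ConfigurationSpace()
cs.add_hyperparameter(UniformFloatHyperparameter("x", -2.0, 2.0))


def tae(cfg):
    # Return one cost per objective; check the expected return format
    # (list vs. dict) against the SMAC version in use.
    x = cfg["x"]
    return [x ** 2, (x - 2) ** 2]


scenario = Scenario({
    "run_obj": "quality",                    # multi-objective requires quality
    "runcount-limit": 20,
    "cs": cs,
    "deterministic": True,
    "multi_objectives": "metric1, metric2",  # at least two objectives
})

smac = SMAC4HPO(
    scenario=scenario,
    tae_runner=tae,
    rng=np.random.RandomState(0),
    multi_objective_algorithm=ParEGO,        # omit both kwargs for the default mean aggregation
    multi_objective_kwargs={"rho": 0.05},
)
incumbent = smac.optimize()
```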


We show an example of how to use multi-objective with a nice Pareto front plot in our examples:
4 changes: 4 additions & 0 deletions docs/faq.rst
@@ -33,6 +33,10 @@ Can I restore SMAC from a previous state?
Yes. Have a look :ref:`here<Restoring>`.


How can I use :term:`BOHB` and/or `HpBandSter <https://github.com/automl/HpBandSter>`_ with SMAC?
The facade ``SMAC4MF`` is the closest implementation to :term:`BOHB` and/or `HpBandSter <https://github.com/automl/HpBandSter>`_.
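
For reference, a hedged sketch of using ``SMAC4MF`` (the module path and keyword names follow the SMAC 1.x multi-fidelity examples such as ``examples/python/plot_mlp_mf.py``; the budget values, ``cs`` and ``train`` are placeholders):

```python
from smac.facade.smac_mf_facade import SMAC4MF
from smac.scenario.scenario import Scenario

# Placeholders: `cs` is a ConfigurationSpace and `train` a target function
# that accepts a configuration plus a seed/budget, both defined elsewhere.
scenario = Scenario({
    "run_obj": "quality",
    "runcount-limit": 100,
    "cs": cs,
    "deterministic": True,
})

smac = SMAC4MF(
    scenario=scenario,
    tae_runner=train,
    # Hyperband-style budgets as used in the multi-fidelity examples:
    intensifier_kwargs={"initial_budget": 5, "max_budget": 50, "eta": 3},
)
incumbent = smac.optimize()
```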


I discovered a bug or SMAC does not behave as expected. Where should I report to?
Open an issue in our issue list on GitHub. Before you report a bug, please make sure that:

8 changes: 8 additions & 0 deletions docs/glossary.rst
@@ -7,6 +7,14 @@ Glossary
Bayesian Optimization. A Black-Box optimization algorithm weighing exploration & exploitation
to find the minimum of its objective.

HB
`Hyperband <https://arxiv.org/abs/1603.06560>`_. A novel bandit-based algorithm for hyperparameter
optimization. Hyperband is an extension of successive halving and therefore works with
multi-fidelities.

BOHB
`Bayesian optimization and Hyperband <https://arxiv.org/abs/1807.01774>`_.

SMAC
Sequential Model-Based Algorithm Configuration.

5 changes: 5 additions & 0 deletions examples/python/plot_mlp_mf.py
@@ -9,6 +9,11 @@
MLP is a deep neural network, and therefore, we choose epochs as fidelity type. The digits dataset
is chosen to optimize the average accuracy on 5-fold cross validation.
.. note::
This example uses the ``SMAC4MF`` facade, which is the closest implementation to
`BOHB <https://github.com/automl/HpBandSter>`_.
"""

import logging
11 changes: 0 additions & 11 deletions requirements.txt

This file was deleted.

17 changes: 15 additions & 2 deletions setup.py
@@ -52,7 +52,7 @@ def read_file(filepath: str) -> str:
"pydocstyle",
"flake8",
"pre-commit",
]
],
}

setuptools.setup(
@@ -67,8 +67,21 @@ def read_file(filepath: str) -> str:
project_urls=project_urls,
version=version,
packages=setuptools.find_packages(exclude=["tests"]),
include_package_data=True,
python_requires=">=3.7",
install_requires=read_file(os.path.join(HERE, "requirements.txt")).split("\n"),
install_requires=[
"numpy>=1.7.1",
"scipy>=1.7.0",
"psutil",
"pynisher>=0.4.1",
"ConfigSpace>=0.5.0",
"joblib",
"scikit-learn>=0.22.0",
"pyrfr>=0.8.0",
"dask",
"distributed",
"emcee>=3.0.0",
],
extras_require=extras_require,
test_suite="pytest",
platforms=["Linux"],
12 changes: 11 additions & 1 deletion smac/__init__.py
@@ -1,4 +1,7 @@
import datetime
import os
import sys
import warnings

name = "SMAC3"
package_name = "smac"
@@ -19,4 +22,11 @@
Matthias Feurer, André Biedenkapp, Difan Deng, Carolin Benjamins, Tim Ruhkopf, René Sass
and Frank Hutter
"""
version = "1.3.1"
version = "1.3.2"


if os.name != "posix":
warnings.warn(
f"Detected unsupported operating system: {sys.platform}."
"Please be aware, that SMAC might not run on this system."
)
9 changes: 1 addition & 8 deletions smac/epm/gaussian_process_mcmc.py
@@ -4,14 +4,7 @@
import warnings
from copy import deepcopy

try:
import emcee
except ImportError as e:
raise ImportError(
"Could not import emcee - emcee is an optional dependency.\n"
"Please install it manually with `pip install emcee`."
) from e

import emcee
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Kernel
46 changes: 30 additions & 16 deletions smac/facade/roar_facade.py
@@ -1,4 +1,4 @@
import typing
from typing import Callable, Dict, List, Optional, Type, Union

import logging

@@ -11,6 +11,9 @@
from smac.initial_design.initial_design import InitialDesign
from smac.intensification.abstract_racer import AbstractRacer
from smac.optimizer.ei_optimization import AcquisitionFunctionMaximizer, RandomSearch
from smac.optimizer.multi_objective.abstract_multi_objective_algorithm import (
AbstractMultiObjectiveAlgorithm,
)
from smac.runhistory.runhistory import RunHistory
from smac.runhistory.runhistory2epm import (
AbstractRunHistory2EPM,
@@ -54,11 +57,18 @@ class ROAR(SMAC4AC):
to perform random search over a fixed set of configurations.
acquisition_function_optimizer_kwargs: Optional[dict]
Arguments passed to constructor of `~acquisition_function_optimizer`
multi_objective_algorithm: Optional[Type["AbstractMultiObjectiveAlgorithm"]]
Class that implements multi objective logic. If None, will use:
smac.optimizer.multi_objective.aggregation_strategy.MeanAggregationStrategy
Multi objective only becomes active if the objective
specified in `~scenario.run_obj` is a List[str] with at least two entries.
multi_objective_kwargs: Optional[Dict]
Arguments passed to `~multi_objective_algorithm`.
initial_design : InitialDesign
initial sampling design
initial_design_kwargs: Optional[dict]
arguments passed to constructor of `~initial_design`
initial_configurations: typing.List[Configuration]
initial_configurations: List[Configuration]
list of initial configurations for initial design --
cannot be used together with initial_design
stats: Stats
@@ -86,21 +96,23 @@ class ROAR(SMAC4AC):
def __init__(
self,
scenario: Scenario,
tae_runner: typing.Optional[typing.Union[typing.Type[BaseRunner], typing.Callable]] = None,
tae_runner_kwargs: typing.Optional[typing.Dict] = None,
tae_runner: Optional[Union[Type[BaseRunner], Callable]] = None,
tae_runner_kwargs: Optional[Dict] = None,
runhistory: RunHistory = None,
intensifier: typing.Optional[typing.Type[AbstractRacer]] = None,
intensifier_kwargs: typing.Optional[typing.Dict] = None,
acquisition_function_optimizer: typing.Optional[typing.Type[AcquisitionFunctionMaximizer]] = None,
acquisition_function_optimizer_kwargs: typing.Optional[dict] = None,
initial_design: typing.Optional[typing.Type[InitialDesign]] = None,
initial_design_kwargs: typing.Optional[dict] = None,
initial_configurations: typing.List[Configuration] = None,
intensifier: Optional[Type[AbstractRacer]] = None,
intensifier_kwargs: Optional[Dict] = None,
acquisition_function_optimizer: Optional[Type[AcquisitionFunctionMaximizer]] = None,
acquisition_function_optimizer_kwargs: Optional[dict] = None,
multi_objective_algorithm: Optional[Type[AbstractMultiObjectiveAlgorithm]] = None,
multi_objective_kwargs: Optional[Dict] = None,
initial_design: Optional[Type[InitialDesign]] = None,
initial_design_kwargs: Optional[dict] = None,
initial_configurations: List[Configuration] = None,
stats: Stats = None,
rng: typing.Optional[typing.Union[int, np.random.RandomState]] = None,
run_id: typing.Optional[int] = None,
dask_client: typing.Optional[dask.distributed.Client] = None,
n_jobs: typing.Optional[int] = 1,
rng: Optional[Union[int, np.random.RandomState]] = None,
run_id: Optional[int] = None,
dask_client: Optional[dask.distributed.Client] = None,
n_jobs: Optional[int] = 1,
):
self.logger = logging.getLogger(self.__module__ + "." + self.__class__.__name__)

@@ -111,7 +123,7 @@ def __init__(

if scenario.run_obj == "runtime":
# We need to do this to be on the same scale for imputation (although we only impute with a Random EPM)
runhistory2epm = RunHistory2EPM4LogCost # type: typing.Type[AbstractRunHistory2EPM]
runhistory2epm = RunHistory2EPM4LogCost # type: Type[AbstractRunHistory2EPM]
else:
runhistory2epm = RunHistory2EPM4Cost

@@ -124,6 +136,8 @@ def __init__(
intensifier=intensifier,
intensifier_kwargs=intensifier_kwargs,
runhistory2epm=runhistory2epm,
multi_objective_algorithm=multi_objective_algorithm,
multi_objective_kwargs=multi_objective_kwargs,
initial_design=initial_design,
initial_design_kwargs=initial_design_kwargs,
initial_configurations=initial_configurations,
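
With the arguments added above, ROAR can be driven multi-objectively just like the other facades. A minimal hedged sketch using the default mean aggregation strategy (``scenario`` and ``tae`` are assumed to be set up as in the multi-objective recipe earlier in this commit, i.e. ``run_obj='quality'`` and at least two entries in ``multi_objectives``):

```python
from smac.facade.roar_facade import ROAR

roar = ROAR(
    scenario=scenario,   # assumed: quality objective, two objective names
    tae_runner=tae,      # assumed: returns one cost per objective
    # multi_objective_algorithm=None -> MeanAggregationStrategy is used;
    # pass e.g. ParEGO with multi_objective_kwargs={"rho": 0.05} to override.
)
incumbent = roar.optimize()
```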
1 change: 0 additions & 1 deletion smac/requirements.txt

This file was deleted.

8 changes: 7 additions & 1 deletion smac/utils/dependencies.py
@@ -3,7 +3,7 @@
import importlib
import re

import pkg_resources
import pkg_resources # type: ignore
from packaging.version import Version

__copyright__ = "Copyright 2021, AutoML.org Freiburg-Hannover"
@@ -83,6 +83,12 @@ def _verify_package(name: str, operation: str, version: str) -> None:
if not operation:
return

# pkg_resources.get_distribution can (not) find a version depending on how the package was built
# if we get version 0.0.0 we fallback to the module's version
if installed_version == Version("0.0.0"):
module = importlib.import_module(name)
installed_version = Version(module.__version__)

required_version = Version(version)

if operation == "==":
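
The fallback added above, rewritten as a self-contained helper for illustration (the function name is made up; the individual calls mirror ``smac/utils/dependencies.py``):

```python
import importlib

import pkg_resources  # type: ignore
from packaging.version import Version


def installed_version(name: str) -> Version:
    # Depending on how a package was built, pkg_resources may report "0.0.0";
    # in that case fall back to the module's own __version__, as done above.
    version = Version(pkg_resources.get_distribution(name).version)
    if version == Version("0.0.0"):
        version = Version(importlib.import_module(name).__version__)
    return version
```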
29 changes: 17 additions & 12 deletions tests/test_multi_objective/test_schaffer.py
@@ -11,6 +11,7 @@
from smac.facade.smac_ac_facade import SMAC4AC
from smac.facade.smac_bb_facade import SMAC4BB
from smac.facade.smac_hpo_facade import SMAC4HPO
from smac.facade.roar_facade import ROAR
from smac.optimizer.multi_objective.parego import ParEGO
from smac.scenario.scenario import Scenario

@@ -72,7 +73,7 @@ def setUp(self):
self.scenario = Scenario(
{
"run_obj": "quality", # we optimize quality (alternatively runtime)
"runcount-limit": 50, # max. number of function evaluations
"runcount-limit": 20, # max. number of function evaluations
"cs": self.cs, # configuration space
"deterministic": True,
"multi_objectives": "metric1, metric2",
@@ -82,29 +83,33 @@

self.facade_kwargs = {
"scenario": self.scenario,
"rng": np.random.RandomState(5),
"rng": np.random.RandomState(0),
"tae_runner": tae,
}

self.parego_facade_kwargs = {
"scenario": self.scenario,
"rng": np.random.RandomState(5),
"rng": np.random.RandomState(0),
"tae_runner": tae,
"multi_objective_algorithm": ParEGO,
"multi_objective_kwargs": {"rho": 0.05},
}

def test_facades(self):
results = []
for facade in [SMAC4BB, SMAC4HPO, SMAC4AC]:
smac = facade(**self.facade_kwargs)
incumbent = smac.optimize()

f1_inc, f2_inc = schaffer(incumbent["x"])
f1_opt, f2_opt = get_optimum()

self.assertAlmostEqual(f1_inc + f2_inc, f1_opt + f2_opt, places=1)
results.append(smac)
for facade in [ROAR, SMAC4BB, SMAC4HPO, SMAC4AC]:
for kwargs in [self.facade_kwargs, self.parego_facade_kwargs]:
smac = facade(**kwargs)
incumbent = smac.optimize()

f1_inc, f2_inc = schaffer(incumbent["x"])
f1_opt, f2_opt = get_optimum()
inc = f1_inc + f2_inc
opt = f1_opt + f2_opt
diff = abs(inc - opt)

assert diff < 0.1
results.append(smac)

return results

11 changes: 0 additions & 11 deletions tox.ini

This file was deleted.
