Optimization Methods

HyperMapper currently supports three different optimization methods: Bayesian Optimization, Local Search, and Evolutionary Optimization. The optimization method used is defined by the optimization_method field in the json file. By default, HyperMapper will use Bayesian Optimization:

"optimization_method": "bayesian_optimization"

To use local search or evolutionary optimization instead, add to the json scenario file:

"optimization_method": "local_search"

or

"optimization_method": "evolutionary_optimization"

The interface for all optimization methods is identical: simply change the optimization_method field to change the method used by HyperMapper. However, each method has additional optional hyperparameters that can be tuned through the json scenario file. The key hyperparameters of each optimization method are outlined below; a full description of all hyperparameters can be found here.
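
To make the layout concrete, the sketch below shows where the optimization_method field sits inside a scenario file. The surrounding fields (application_name, optimization_objectives, input_parameters and the example parameter x1) are illustrative placeholders, not described on this page, and should be adapted to your own application:

    {
        "application_name": "my_app",
        "optimization_objectives": ["runtime"],
        "optimization_method": "bayesian_optimization",
        "input_parameters": {
            "x1": {
                "parameter_type": "integer",
                "values": [0, 10]
            }
        }
    }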

Bayesian Optimization

Bayesian Optimization is the default optimization method of HyperMapper. It is recommended when the goal is to minimize the black-box function with the fewest possible function evaluations. HyperMapper's Bayesian Optimization algorithm uses random scalarizations to handle any number of optimization objectives simultaneously, similar to the work of Paria et al. The acquisition function and scalarization method used by HyperMapper during optimization can be controlled with the json file (an example combining both fields is shown after the list):

  • acquisition_function: ["UCB", "TS", "EI"]["EI"].
    The acquisition function used during the Bayesian Optimization loop. Example:

    "acquisition_function": "EI"
  • scalarization_method: ["linear", "tchebyshev", "modified_tchebyshev"]["tchebyshev"].
    The scalarization method used to scalarize the objectives in a multi-objective setting. The linear and modified_tchebyshev methods are implemented as presented by Paria et al., while tchebyshev is implemented as presented by Knowles. Example:

    "scalarization_method": "tchebyshev"

It is possible to inject prior knowledge into the optimization to help HyperMapper converge faster. It is also possible to tell HyperMapper to focus its search on a specific portion of the Pareto front when in a multi-objective setting. See this example for details. Note that this feature is still under development.

Local Search

Local search is a lightweight optimization method, recommended when the black-box function is cheap to evaluate. HyperMapper implements a best-improvement multi-start local search algorithm. The algorithm randomly samples 20,000 points and starts best-improvement local searches from the best 20 points found. If the user provides a prior for the input space, half of the 20,000 random samples will be sampled from the prior. The initial number of random samples, the number of local searches started, and the maximum number of black-box function evaluations performed can be controlled in the json file (an example combining these fields is shown after the list):

  • local_search_starting_points: [integer][10].
    Number of starting points for the multi-start local search. Example:

    "local_search_starting_points": 10
  • local_search_random_points: [integer][10000].
    Number of random points sampled for the multi-start local search. HyperMapper will randomly sample double the number of points specified; half of the points will be sampled from the prior, if one is provided. Example:

    "local_search_random_points": 5000
  • local_search_evaluation_limit: [integer][-1].
    The maximum number of function evaluations the local search can perform. If -1, the number of function evaluations is not limited. Example:

    "local_search_evaluation_limit": 5000
    

The local search can also be used in a multi-objective setting by defining scalarization weights for each objective in the json:

  • local_search_scalarization_weights: [array][float][1].
    Weights to use in the scalarization of the optimization objectives. The number of weights must match the number of objectives. The weights should sum to 1; if they do not, HyperMapper will normalize them to 1. Example:

    "local_search_scalarization_weights": [0.3, 0.7]

Note: HyperMapper uses the local search internally to optimize the acquisition function of the Bayesian Optimization method. Changing the values of the local search hyperparameters will therefore also affect the Bayesian Optimization method.

Evolutionary Optimization

Evolutionary optimization is another lightweight optimization method for black-box functions that are cheap to evaluate. HyperMapper implements a variant of the tournament selection evolutionary algorithm proposed by Real et al. The algorithm randomly samples a number of configurations from the search space and then, for a number of generations, successively selects a random batch of these configurations for tournament selection and mutation. At each generation, the worst or the oldest configuration is removed from the population and a new child is added. The following parameters can be controlled in the json file (an example combining them is shown after the list):

  • evolution_population_size: [integer][50]. Number of points the Evolutionary Algorithm keeps track of. Example:

    "evolution_population_size": 100
  • evolution_generations: [integer][150]. Number of iterations through the evolutionary loop. Example:

    "evolution_generations": 200
  • mutation_rate: [integer][1]. Number of parameters to mutate in each generation. Example:

    "mutation_rate": 2
  • batch_size: [integer][2]. Number of samples to pick for tournament selection. If crossover is used, this must be at least three. Example:

    "batch_size": 3
  • evolution_crossover: [boolean][false]. Whether to use crossover. Example:

    "evolution_crossover": true
  • regularize_evolution: [boolean][false]. Whether to regularize the evolution, i.e., remove the oldest configuration from the population instead of the worst. Example:

    "regularize_evolution": true