
Batch Branin


In this example, we show how to use HyperMapper's batch function evaluations to optimize the Branin function. We aim to minimize the value of this function over two parameters, x1 and x2, and at each iteration we will evaluate two configurations.

The Objective Function

In order to leverage HyperMapper's batch evaluations, we must implement a Python method that receives several configurations to evaluate as input and returns the values of all configurations as output:

import math

def branin_function(X):
    values = []
    # X is a dictionary of lists: one list of values per input parameter
    for idx in range(len(X['x1'])):
        x1 = X['x1'][idx]
        x2 = X['x2'][idx]
        # Branin function constants
        a = 1.0
        b = 5.1 / (4.0 * math.pi * math.pi)
        c = 5.0 / math.pi
        r = 6.0
        s = 10.0
        t = 1.0 / (8.0 * math.pi)

        y_value = a * (x2 - b * x1 * x1 + c * x1 - r) ** 2 + s * (1 - t) * math.cos(x1) + s
        values.append(y_value)

    return values

Note that the input to this method is a dictionary of lists. The keys of this dictionary are the input parameters of our method, and each key holds a list with the values of that input parameter for each configuration in the batch. Likewise, our method returns a list of values, containing the value of the Branin function for each configuration.
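For illustration, a batch of two configurations could be passed to the method as follows (the parameter values here are arbitrary examples, not values chosen by HyperMapper):

# A hypothetical batch of two configurations: (x1=0.0, x2=1.0) and (x1=3.0, x2=2.5)
X = {'x1': [0.0, 3.0], 'x2': [1.0, 2.5]}
values = branin_function(X)
print(values)  # a list with two Branin values, one per configuration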

Note that the method can evaluate any number of configurations. This is important because, when batch evaluations are used, HyperMapper will request the evaluation of all configurations from the DoE phase at once. An implementation of this method can be found here.

The JSON Configuration File

Batch evaluations are enabled in the JSON configuration file via the evaluations_per_optimization_iteration field. This field tells HyperMapper how many configurations should be evaluated in each batch:

{
    "application_name": "batch_branin",
    "optimization_objectives": ["Value"],
    "optimization_iterations": 10,
    "evaluations_per_optimization_iteration": 2,
    "input_parameters" : {
        "x1": {
            "parameter_type" : "real",
            "values" : [-5, 10]
        },
        "x2": {
            "parameter_type" : "real",
            "values" : [0, 15]
        }
    }
}

In this example, we evaluate two configurations at each optimization iteration. You can find this JSON file in batch_branin_scenario.json.

Run HyperMapper

In order to run this example, we use:

python3 example_scenarios/synthetic/batch_branin/batch_branin.py
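For reference, the body of batch_branin.py is essentially a call to HyperMapper's optimizer with the scenario file and the objective function. A minimal sketch, where the scenario path is our assumption based on the repository layout (see the actual script in the repository):

from hypermapper import optimizer

# Assumed location of the scenario file shown above
scenario_file = "example_scenarios/synthetic/batch_branin/batch_branin_scenario.json"

# HyperMapper drives the optimization and calls branin_function with batches of configurations
optimizer.optimize(scenario_file, branin_function)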

An example of stdout output can be found here.

The result of this script is a CSV file called batch_branin_output_samples.csv.
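To inspect the results programmatically, the CSV can be loaded with pandas. A small sketch, assuming the column names follow the parameter and objective names from the scenario file (x1, x2, Value):

import pandas as pd

# Column names are assumed to match the scenario file: x1, x2, Value
samples = pd.read_csv("batch_branin_output_samples.csv")

# Configuration with the lowest objective value found so far
best = samples.loc[samples["Value"].idxmin()]
print(best)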

Extending to the Multi-objective Case

If the application has multiple objectives, the output of the Python method has to be a dictionary of lists. The keys of this dictionary must be the optimization objectives, including the feasibility constraint if present. Each key must hold a list of size evaluations_per_optimization_iteration, containing the values of that objective for each configuration. For example, using the Chakong-Haimes function:

def chakong_haimes(X):
    outputs = {}
    outputs['f1_value'] = []
    outputs['f2_value'] = []
    outputs['Valid'] = []
    for idx in range(len(X['x1'])):
        x1 = X['x1'][idx]
        x2 = X['x2'][idx]
        # compute the two objectives
        f1_value = 2 + (x1 - 2)**2 + (x2 - 1)**2
        f2_value = 9*x1 - (x2 - 1)**2

        # check constraints
        g1 = x1*x1 + x2*x2 <= 225
        g2 = x1 - 3*x2 + 10 <= 0
        valid = g1 and g2

        outputs['f1_value'].append(f1_value)
        outputs['f2_value'].append(f2_value)
        outputs['Valid'].append(valid)

    return outputs
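The scenario file changes accordingly: both objectives are listed under optimization_objectives and the feasibility output is declared with a feasible_output field. The sketch below is only an illustration; the feasible_output options and the parameter ranges are assumptions based on HyperMapper's other examples, so check the documentation for the exact schema:

{
    "application_name": "chakong_haimes",
    "optimization_objectives": ["f1_value", "f2_value"],
    "optimization_iterations": 10,
    "evaluations_per_optimization_iteration": 2,
    "feasible_output": {
        "name": "Valid",
        "true_value": "True",
        "false_value": "False",
        "enable_feasible_predictor": true
    },
    "input_parameters" : {
        "x1": {
            "parameter_type" : "real",
            "values" : [-20, 20]
        },
        "x2": {
            "parameter_type" : "real",
            "values" : [-20, 20]
        }
    }
}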