
Problem using Gaussian Process when the objective function value is constant #59

Open
rzahra opened this issue Sep 20, 2021 · 4 comments


rzahra (Contributor) commented Sep 20, 2021

Hi,

I've tried to run your "branin" example with a constant objective value, i.e. "y_value = 1" for every input point, and set the "model" to "gaussian_process":

"models": {
    "model": "gaussian_process"
},

However, it does not work and I get the following error. Could you please let me know what the problem is?

Best Regards,
Zahra

Traceback (most recent call last):
  File "branin.py", line 39, in <module>
    main()
  File "branin.py", line 34, in main
    optimizer.optimize(parameters_file, branin_function)
  File "/home/zahra/anaconda3/lib/python3.7/site-packages/hypermapper/optimizer.py", line 125, in optimize
    config, black_box_function=black_box_function, profiling=profiling
  File "/home/zahra/anaconda3/lib/python3.7/site-packages/hypermapper/bo.py", line 391, in main
    objective_limits=objective_limits,
  File "/home/zahra/anaconda3/lib/python3.7/site-packages/hypermapper/models.py", line 457, in generate_mono_output_regression_models
    regressor[Ycol].optimize()
  File "/home/zahra/anaconda3/lib/python3.7/site-packages/GPy/core/gp.py", line 659, in optimize
    ret = super(GP, self).optimize(optimizer, start, messages, max_iters, ipython_notebook, clear_after_finish, **kwargs)
  File "/home/zahra/anaconda3/lib/python3.7/site-packages/paramz/model.py", line 111, in optimize
    opt.run(start, f_fp=self._objective_grads, f=self._objective, fp=self._grads)
  File "/home/zahra/anaconda3/lib/python3.7/site-packages/paramz/optimization/optimization.py", line 51, in run
    self.opt(x_init, **kwargs)
  File "/home/zahra/anaconda3/lib/python3.7/site-packages/paramz/optimization/optimization.py", line 124, in opt
    opt_result = optimize.fmin_l_bfgs_b(f_fp, x_init, maxfun=self.max_iters, maxiter=self.max_iters, **opt_dict)
  File "/home/zahra/anaconda3/lib/python3.7/site-packages/scipy/optimize/lbfgsb.py", line 199, in fmin_l_bfgs_b
    **opts)
  File "/home/zahra/anaconda3/lib/python3.7/site-packages/scipy/optimize/lbfgsb.py", line 345, in _minimize_lbfgsb
    f, g = func_and_grad(x)
  File "/home/zahra/anaconda3/lib/python3.7/site-packages/scipy/optimize/lbfgsb.py", line 295, in func_and_grad
    f = fun(x, *args)
  File "/home/zahra/anaconda3/lib/python3.7/site-packages/scipy/optimize/optimize.py", line 327, in function_wrapper
    return function(*(wrapper_args + args))
  File "/home/zahra/anaconda3/lib/python3.7/site-packages/scipy/optimize/optimize.py", line 65, in __call__
    fg = self.fun(x, *args)
  File "/home/zahra/anaconda3/lib/python3.7/site-packages/paramz/model.py", line 273, in _objective_grads
    self.optimizer_array = x
  File "/home/zahra/anaconda3/lib/python3.7/site-packages/paramz/parameterized.py", line 339, in __setattr__
    return object.__setattr__(self, name, val)
  File "/home/zahra/anaconda3/lib/python3.7/site-packages/paramz/core/parameter_core.py", line 124, in optimizer_array
    self.trigger_update()
  File "/home/zahra/anaconda3/lib/python3.7/site-packages/paramz/core/updateable.py", line 79, in trigger_update
    self._trigger_params_changed(trigger_parent)
  File "/home/zahra/anaconda3/lib/python3.7/site-packages/paramz/core/parameter_core.py", line 134, in _trigger_params_changed
    self.notify_observers(None, None if trigger_parent else -np.inf)
  File "/home/zahra/anaconda3/lib/python3.7/site-packages/paramz/core/observable.py", line 91, in notify_observers
    [callble(self, which=which) for _, _, callble in self.observers]
  File "/home/zahra/anaconda3/lib/python3.7/site-packages/paramz/core/observable.py", line 91, in <listcomp>
    [callble(self, which=which) for _, _, callble in self.observers]
  File "/home/zahra/anaconda3/lib/python3.7/site-packages/paramz/core/parameter_core.py", line 508, in _parameters_changed_notification
    self.parameters_changed()
  File "/home/zahra/anaconda3/lib/python3.7/site-packages/GPy/core/gp.py", line 267, in parameters_changed
    self.posterior, self._log_marginal_likelihood, self.grad_dict = self.inference_method.inference(self.kern, self.X, self.likelihood, self.Y_normalized, self.mean_function, self.Y_metadata)
  File "/home/zahra/anaconda3/lib/python3.7/site-packages/GPy/inference/latent_function_inference/exact_gaussian_inference.py", line 58, in inference
    Wi, LW, LWi, W_logdet = pdinv(Ky)
  File "/home/zahra/anaconda3/lib/python3.7/site-packages/GPy/util/linalg.py", line 207, in pdinv
    L = jitchol(A, *args)
  File "/home/zahra/anaconda3/lib/python3.7/site-packages/GPy/util/linalg.py", line 75, in jitchol
    raise linalg.LinAlgError("not positive definite, even with jitter.")
numpy.linalg.LinAlgError: not positive definite, even with jitter.

ksehic (Collaborator) commented Sep 20, 2021

@rzahra Can you copy-paste the .py and .json files of the Branin function that you are using here? When I ran it myself with y_value=1 and a GP, I did not get the error. In general, using a GP for a constant function is not a smart choice, but it should work...
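To separate HyperMapper from GPy, here is a minimal sketch (not from this thread) that fits a GP directly on constant targets; the Matern52 kernel and the random Branin-domain inputs are illustrative assumptions, not necessarily what HyperMapper configures internally:

import numpy as np
import GPy

# 10 random points in the Branin domain; the objective is the constant 1 everywhere
X = np.random.uniform(low=[-5.0, 0.0], high=[10.0, 15.0], size=(10, 2))
Y = np.ones((10, 1))

# illustrative kernel choice; HyperMapper's internal model setup may differ
kernel = GPy.kern.Matern52(input_dim=2)
model = GPy.models.GPRegression(X, Y, kernel)
model.optimize()  # if this raises LinAlgError, the failure is reproducible in GPy alone
print(model)

If this snippet runs cleanly, the problem is more likely in how HyperMapper builds the model, or in mismatched package versions, than in GPy itself.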

rzahra (Contributor, Author) commented Sep 20, 2021

#!/usr/bin/python
import math

import os
import sys
import warnings
from collections import OrderedDict

from hypermapper import optimizer  # noqa


def branin_function(X):
    """
    Compute the branin function.
    :param X: dictionary containing the input points.
    :return: the value of the branin function
    """
    x1 = X["x1"]
    x2 = X["x2"]
    a = 1.0
    b = 5.1 / (4.0 * math.pi * math.pi)
    c = 5.0 / math.pi
    r = 6.0
    s = 10.0
    t = 1.0 / (8.0 * math.pi)

    # y_value = a * (x2 - b * x1 * x1 + c * x1 - r) ** 2 + s * (1 - t) * math.cos(x1) + s
    y_value = 1
    return y_value


def main():
    parameters_file = "/home/zahra/Desktop/hypermapper-simple-test/example_scenarios/quick_start/branin_scenario.json"
    optimizer.optimize(parameters_file, branin_function)
    print("End of Branin.")


if __name__ == "__main__":
    main()

rzahra (Contributor, Author) commented Sep 20, 2021

{
    "application_name": "branin",
    "optimization_objectives": ["Value"],
    "optimization_iterations": 20,
    "optimization_method": "bayesian_optimization",
    "acquisition_function_optimizer": "local_search",
    "design_of_experiment": {
        "doe_type": "random sampling",
        "number_of_samples": 3
    },
    "models": {
        "model": "gaussian_process"
    },
    "input_parameters": {
        "x1": {
            "parameter_type": "real",
            "values": [-5, 10],
            "parameter_default": 0
        },
        "x2": {
            "parameter_type": "real",
            "values": [0, 15],
            "parameter_default": 0
        }
    }
}

ksehic (Collaborator) commented Sep 20, 2021

@rzahra Strange. I didn't get the error. I was using the latest version of Hypermapper from this repo; the pip version is not yet updated, as far as I can see... Can you check that you are using the latest version of Hypermapper, or whether there is any mismatch between anaconda3 and Hypermapper?
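A quick way to check which Hypermapper the script actually picks up (a sketch under standard pip/setuptools assumptions, not part of the original comment):

import hypermapper
print(hypermapper.__file__)  # a site-packages path means the pip release; a repo path means the local checkout

import pkg_resources
print(pkg_resources.get_distribution("hypermapper").version)  # version of the installed distribution, if one is installed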

My result...

x1 x2 Value Timestamp
0 0 1 1
5.60301701755873 6.04807923864642 1 1
-3.24438823881358 12.702020869894 1 1
2.19105319236382 2.32721787344419 1 368
1.18058256785225 0.280883034597277 1 816
3.77255434050274 1.29103718689401 1 1267
7.45611027312549 0.329748493671475 1 1706
10 0.396969533465726 1 2113
1.51087515746304 1.89817942409036 1 2500
8.33121229115913 1.29476803461852 1 2949
5.098804488615 1.87438181073141 1 3388
3.51543678199592 1.7371702889608 1 3824
3.12219508823726 1.00400351211924 1 4260
3.88174205850473 1.53863041723447 1 4736
8.54319783549642 0.8461187299401 1 5171
7.85486273117496 1.34970757932602 1 5637
8.57199759865176 0.755149316832652 1 6061
-1.64473966669573 0.213732575999534 1 6544
8.3284077498699 3.34379278975797 1 6618
8.26472155895912 0.574587731483283 1 7075
6.74071982162813 2.10840547806442 1 7528
6.5380851578267 0.668962647053504 1 8053
-2.16236399703512 0.712652098237651 1 8510
