Conversation

@nychiang nychiang commented Aug 2, 2025

Implement a logger using the code from https://docs.python.org/3/library/logging.html

CLOSE #717
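A minimal sketch of what a logger setup like this might look like with the stdlib `logging` module (the `hiopbbpy` name and the `%(name)s`-prefix format are assumptions inferred from the log output below, not the actual PR code):

```python
import logging
import sys

def get_logger(name="hiopbbpy", level=logging.INFO):
    """Return a named logger that prefixes each line with the package name."""
    logger = logging.getLogger(name)
    logger.setLevel(level)
    if not logger.handlers:  # avoid attaching duplicate handlers on repeated calls
        handler = logging.StreamHandler(sys.stdout)
        handler.setFormatter(logging.Formatter("%(name)s %(message)s"))
        logger.addHandler(handler)
    return logger

logger = get_logger()
logger.info("Logger level: info")  # -> "hiopbbpy Logger level: info"
logger.critical("Bayesian Optimization completed")
```

Routing through a single named logger lets callers silence or redirect all `hiopbbpy` output via the standard `logging` configuration machinery.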

@nychiang nychiang requested review from thartland and cnpetra August 22, 2025 17:06
@nychiang (Collaborator, Author):
Here is one example of the log:

hiopbbpy Problem name: LpNormProblem
hiopbbpy Max BO iter: 5
hiopbbpy Optimizing acquisition (LCB) with 10 random initial points
hiopbbpy Batch type: KB
hiopbbpy Batch size: 2
hiopbbpy Internal optimization solver: IPOPT
hiopbbpy Internal optimization solver options: {'max_iter': 200, 'print_level': 1}
hiopbbpy Initial training set: 5 samples, 2 dimensions
hiopbbpy Bounds: [(-5, 5), (-5, 5)]
hiopbbpy Logger level: info
hiopbbpy *****************************
hiopbbpy Iteration 1/5
hiopbbpy In batch 1/2
hiopbbpy Start finding the best sampling point:
hiopbbpy Acquisition optimization finished with 10 successes, 0 failures
hiopbbpy Acquisition values: min = -4.0166e-01, mean = -4.0166e-01, max = -4.0166e-01
hiopbbpy In batch 2/2
hiopbbpy Start finding the best sampling point:
hiopbbpy Acquisition optimization finished with 10 successes, 0 failures
hiopbbpy Acquisition values: min = 4.1435e-01, mean = 1.7229e+00, max = 2.5907e+00
hiopbbpy Training set size is now 7
hiopbbpy Current best objective: 1.8002e+00 (previous best: 2.3277e+00)
hiopbbpy Improvement: 5.2747e-01
hiopbbpy *****************************
...
hiopbbpy *****************************
hiopbbpy Iteration 5/5
hiopbbpy In batch 1/2
hiopbbpy Start finding the best sampling point:
hiopbbpy Acquisition optimization finished with 10 successes, 0 failures
hiopbbpy Acquisition values: min = 1.7660e+00, mean = 1.7971e+00, max = 1.8910e+00
hiopbbpy In batch 2/2
hiopbbpy Start finding the best sampling point:
hiopbbpy Acquisition optimization finished with 10 successes, 0 failures
hiopbbpy Acquisition values: min = 1.7675e+00, mean = 1.7763e+00, max = 1.8014e+00
hiopbbpy Training set size is now 15
hiopbbpy Current best objective: 1.7875e+00 (previous best: 1.7875e+00)
hiopbbpy Improvement: 0.0000e+00
hiopbbpy ===================================
hiopbbpy Bayesian Optimization completed
hiopbbpy Total evaluations: 10
hiopbbpy Optimal at BO iteration: 4
hiopbbpy Best value: [1.78751014]
hiopbbpy ===================================


y_train_virtual = y_train.copy()  # old training + batch_size num of virtual points
for j in range(self.batch_size):
    # Get a new sample point
    self.logger.scalars(f"In batch {j+1}/{self.batch_size}")
Collaborator:
This is not the batch number but rather the j-th point of a given batch, so "In batch {j+1}/{self.batch_size}" is misleading.

Collaborator Author:
I don't get this point. With j in range(self.batch_size), should j mean the batch number? In my example above, this line prints:
hiopbbpy In batch 1/2
hiopbbpy In batch 2/2

self.logger.critical("===================================")
self.logger.critical("Bayesian Optimization completed")
self.logger.critical(f"Total evaluations: {len(self.y_hist)}")
self.logger.critical(f"Optimal at BO iteration: {self.idx_opt//self.batch_size+1}")
@thartland (Collaborator) commented Aug 22, 2025:
When the batch_size is 1, I think it makes more sense to say "Optimal at BO iteration: {self.idx_opt}/{self.bo_maxiter}". When the batch_size is greater than 1, I think we need to be more careful, since a batch of points is added per BO iteration and it's more complicated. E.g., if the batch size is 5, then the first 5 points in self.y_hist are obtained in the first BO iteration, so self.idx_opt being 2 doesn't mean the optimal point was obtained in the second BO iteration.

Collaborator Author:
I don't get this.
If batch_size = 5 and idx_opt = 2, the optimal point was obtained in the 1st BO iteration, wasn't it?
The expression used here is a floor division, and hence {self.idx_opt//self.batch_size+1} = 2//5 + 1 = 0 + 1 = 1.
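As a quick sanity check of the floor-division arithmetic (assuming self.idx_opt is a 0-based index into self.y_hist, as the discussion suggests):

```python
def bo_iteration(idx_opt, batch_size):
    """Map a 0-based index into y_hist to its 1-based BO iteration number."""
    return idx_opt // batch_size + 1

# With batch_size = 5, the first 5 points all belong to iteration 1,
# the next 5 to iteration 2, and so on.
print([bo_iteration(i, 5) for i in range(12)])
# [1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 3, 3]
```

In particular idx_opt = 2 maps to iteration 1, matching the author's reading; the mapping only works as intended if idx_opt is 0-based.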

@thartland (Collaborator) left a comment:
See inline comments below for requests for more descriptive logging messages. Overall a great addition!

@nychiang nychiang marked this pull request as ready for review August 25, 2025 17:51

Successfully merging this pull request may close these issues.

HiOpBB logging
2 participants