Releases: secondmind-labs/trieste
Release 0.10.0
New functionality
BALD active learning acquisition function (#417)
Continuous Thompson Sampling acquisition functions (#475, #480, #486, #500)
Random Sampling acquisition function (#493)
Support for Keras models and trajectory samplers (#459, #467, #468)
Utilities for quickly constructing GPflow models (#465, #483) (see the sketch below)
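The model-construction utilities above can be used roughly as in the sketch below. This is a minimal illustration, not an excerpt from the release notes: the helper names build_gpr and GaussianProcessRegression (in trieste.models.gpflow), plus the Dataset and Box classes, are assumed from the library's documented interface and may differ slightly between versions.

```python
# Sketch only: API names assumed as described in the note above.
import tensorflow as tf

from trieste.data import Dataset
from trieste.models.gpflow import GaussianProcessRegression, build_gpr
from trieste.space import Box

search_space = Box([0.0, 0.0], [1.0, 1.0])

# toy initial design: 5 random points with a dummy scalar observation each
query_points = search_space.sample(5)
observations = tf.reduce_sum(query_points ** 2, axis=-1, keepdims=True)
initial_data = Dataset(query_points, observations)

# build a GPflow GPR with sensible defaults, then wrap it for use in Trieste
gpflow_model = build_gpr(initial_data, search_space)
model = GaussianProcessRegression(gpflow_model)
```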
Improvements
Support for SVGP and VGP models with GIBBON (#491)
Support for covariance_between_points with multi-output GPR/SVGP/VGP models (#492)
Support splitting up acquisition function calls to reduce memory usage (#497)
Improve TensorBoard logging to handle GPflux models, ask-tell optimization and wall-clock timings (#469, #470, #488)
Improve static type checking for rules and samplers that depend on specific types of models (#463, #466, #474, #479, #482, #499, #501)
Build Changes
OpenAI Gym Lunar Lander tutorial (#456)
Support and test with both TF 2.4 and TF 2.5 (#484, #490)
Simplify optimizer code (#496)
Full Changelog: v0.9.1...v0.10.0
Release 0.9.1
This point release temporarily reverts the GPflux RFF fix (#420) so as to maintain support for TensorFlow 2.4. It also adds the following functionality.
New functionality
Support for vectorized acquisition functions (#458)
Improvements
Fix TF compilation issue for VGP models (#418)
Full Changelog: v0.9.0...v0.9.1
Release 0.9.0
New functionality
t-IMSE acquisition functions (#426, #429)
Kriging Believer acquisition functions (#426, #428, #451)
Initial support for Keras (#451, #452)
Improvements
Refactor model-sampler interactions (#398)
Parallel acquisition function optimizers (#438)
Fix GPflux RandomFourierFeatures import (#420)
Use default optimizers with configs (#434)
Make AcquisitionFunctionBuilder generic on ProbabilisticModel (#433)
Build Changes
Notebook formatting (#432)
Fix test random number seeding (#450)
Active learning integration tests (#441)
Breaking Changes
ModelStack renamed to TrainableModelStack
LocalPenalizationAcquisitionFunction renamed to LocalPenalization
trieste.acquisition.function.local_penalisation renamed to greedy_batch (import changes illustrated below)
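To illustrate the renames above, a hedged sketch of the corresponding import changes follows; the exact re-export locations (TrainableModelStack from trieste.models, LocalPenalization from the renamed greedy_batch module) are assumptions and may differ in your version.

```python
# Sketch only: import locations assumed, see note above.

# Before 0.9.0 (old names, shown for comparison):
# from trieste.models import ModelStack
# from trieste.acquisition.function.local_penalisation import LocalPenalizationAcquisitionFunction

# From 0.9.0 onwards:
from trieste.models import TrainableModelStack  # previously ModelStack
from trieste.acquisition.function.greedy_batch import LocalPenalization  # previously LocalPenalizationAcquisitionFunction
```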
Full Changelog: v0.8.0...v0.9.0
Release 0.8.0
New functionality
Support for deep Gaussian processes with GPflux (#357, #364, #377)
Support for asynchronous Bayesian Optimization (#366, #374, #380, #381, #384, #386)
Active learning: predictive variance (#294) and expected feasibility (#421) acquisition functions
Tagged product search spaces (#367, #387, #403, #422) (see the sketch after this list)
TensorBoard monitoring support (#370, #407)
Trid (#378) and simple quadratic (#404) objective functions
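A minimal sketch of a tagged product search space is below. The class and constructor arguments (TaggedProductSearchSpace and DiscreteSearchSpace in trieste.space) are assumed from the library's interface rather than taken from these notes.

```python
# Sketch only: class names and arguments assumed, see note above.
import tensorflow as tf

from trieste.space import Box, DiscreteSearchSpace, TaggedProductSearchSpace

continuous = Box([0.0, 0.0], [1.0, 1.0])
discrete = DiscreteSearchSpace(tf.constant([[0.0], [0.5], [1.0]], dtype=tf.float64))

# combine the two subspaces into one product space, addressable by tag
search_space = TaggedProductSearchSpace(
    spaces=[continuous, discrete],
    tags=["continuous", "discrete"],
)

samples = search_space.sample(10)  # each sample has 2 + 1 = 3 columns
```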
Improvements
Make datasets an optional keyword argument for rule acquisition and acquisition function preparation (#383)
Split up function.py (interfaces must now be imported from trieste.acquisition or trieste.acquisition.interfaces) (#408)
Improve config handling (support dictionary configs again; replace create_optimizer with ModelRegistry; add a tutorial) (#389)
Allow empty observation for non-dominated space partitions (#356)
Allow specification of scipy optimizer kwargs for optimizing acquisition functions (#410)
Refactor model optimizers (TFOptimizer renamed to BatchOptimizer) (#372, #405)
Build changes
Speed up CI tests (#377, #390, #391, #395, #399, #404, #409)
Improved documentation (#382, #400 and various above)
Full Changelog: v0.7.0...v0.8.0
Release 0.7.0
New functionality
Ask-Tell API (#346) (see the sketch after this list)
GPflux interface (but no models yet) (#355)
Michalewicz function (#350)
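The Ask-Tell workflow might look roughly like the sketch below. This is an illustration only: the AskTellOptimizer class in trieste.ask_tell_optimization and its ask/tell methods are assumed from the library's interface, and the model-construction helper from the 0.10.0 release above is reused here for brevity.

```python
# Sketch only: API names assumed, see note above.
import tensorflow as tf

from trieste.ask_tell_optimization import AskTellOptimizer
from trieste.data import Dataset
from trieste.models.gpflow import GaussianProcessRegression, build_gpr
from trieste.space import Box


def objective(x: tf.Tensor) -> tf.Tensor:
    return tf.reduce_sum((x - 0.5) ** 2, axis=-1, keepdims=True)


search_space = Box([0.0, 0.0], [1.0, 1.0])
initial_points = search_space.sample(5)
initial_data = Dataset(initial_points, objective(initial_points))
model = GaussianProcessRegression(build_gpr(initial_data, search_space))

ask_tell = AskTellOptimizer(search_space, initial_data, model)
for _ in range(3):
    new_points = ask_tell.ask()  # Trieste proposes the next query points
    ask_tell.tell(Dataset(new_points, objective(new_points)))  # we report the observations
```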
Improvements
Support in-place updates to acquisition functions to avoid having to retrace every acquisition loop. Update existing acquisition function builders to use this. (#271, #327, #340, #349, #352)
Fix SVGP interface to be consistent with other GPflow interfaces (#320)
Refactor Pareto code. Note that hypervolume acquisition function builders are now passed partition bounds. (#328)
Simplify trust region handling (#306)
Build changes
Split model interfaces into directories (#272)
Rename trieste.type module to trieste.types (#323)
Remove homespun deepcopy functionality (#339)
Improve type checking (#307, #331, #333)
Use extend-exclude for flake8 and black (#348)
Reduce RAM usage in integration tests (#330)
Release 0.6.1
This point release updates trieste.space.Box to support empty boxes. It adds no new features.
Release 0.6.0
New functionality
New acquisition functions:
- AugmentedExpectedImprovement (#265)
- GIBBON (#275)
- ExpectedConstrainedHypervolumeImprovement (#285)
- BatchMonteCarloExpectedHypervolumeImprovement (#257)
New samplers:
Improvements
Better model fitting:
- GPR kernel initialization (#277)
- BayesianOptimizer initial model fit (#283)
- Support model-specific optimization parameters (#287)
- Include the kernel prior term in the likelihood when choosing kernel parameters (#290, #291)
- Sample from constrained kernel parameters before model fitting (#297, #303, #305)
Better acquisition optimization:
- Better error handling in continuous acquisition optimizer (#289, #313)
- Better continuous optimizers with L-BFGS-B support (#276) and recovery restarts (#313)
Experimental design support for continuous search spaces through Sobol/Halton (#259) (see the sketch after this list)
ExpectedConstrainedImprovement efficiency improvement (#284)
Better handling of tf.function (#299, #309)
Objective functions moved to a separate package, added search space variables (#302)
Better numerical stability in GIBBON/MES (#310)
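The Sobol/Halton experimental design support can be sketched as below; the method names sample_sobol and sample_halton on trieste.space.Box are assumed from the library's interface.

```python
# Sketch only: method names assumed, see note above.
from trieste.space import Box

search_space = Box([0.0, 0.0, 0.0], [1.0, 1.0, 1.0])

sobol_points = search_space.sample_sobol(16)    # [16, 3] Sobol design
halton_points = search_space.sample_halton(16)  # [16, 3] Halton design
```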
Build changes
More notebook documentation (#280, #288, #310)
Improved instructions for contributions and discussions (#301)
Release 0.5.1
This point release updates the GPflow dependency to version 2.2. It adds no new features.
Release 0.5.0
New functionality
add support for multi-objective optimization with the expected hypervolume improvement acquisition function (#177) (#194) (#202) (#207) (#217) (#225) (#243)
add support for batch optimization via local penalization (#230) (#251) (see the sketch after this list)
allow custom acquisition function optimizers (#186)
add various toy objective functions: Gramacy & Lee (#168), Goldstein-Price (#169), VLMOP2, DTLZ (#190), Hartmann (#204), Rosenbrock, Ackley (#241), Shekel (#250)
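A rough sketch of batch point selection via local penalization follows. It uses the current class name LocalPenalization (introduced by the 0.9.0 rename above; at 0.5.0 the builder was LocalPenalizationAcquisitionFunction), and the EfficientGlobalOptimization rule and its num_query_points argument are assumed from the library's interface.

```python
# Sketch only: class names and arguments assumed, see note above.
from trieste.acquisition import LocalPenalization
from trieste.acquisition.rule import EfficientGlobalOptimization
from trieste.space import Box

search_space = Box([0.0, 0.0], [1.0, 1.0])

# propose batches of 4 points per optimization step
batch_rule = EfficientGlobalOptimization(
    LocalPenalization(search_space),
    num_query_points=4,
)
# the rule can then be passed to BayesianOptimizer.optimize(..., acquisition_rule=batch_rule)
```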
Improvements
simplify single model/dataset use case (#252)
expose predict_y from GPflow models (#254)
support arbitrary tensor-likes as inputs, not just lists (#234)
improve and track unit test code coverage (#222) (#236)
Build changes
simplify docs build and add it to build checks (#231) (#240)
add taskipy support for running tests (#219) (#244)
Release 0.4.0
New functionality
add Monte-Carlo-based sampler for joint distributions, using reparametrization trick (#93)
add Monte-Carlo-based batch Expected Improvement acquisition function (#133)
add tutorials for batch-sequential acquisition functions (#149) (#151)
add predict_joint method to root model interface ProbabilisticModel, for predicting the mean and variance of joint distributions (#93)
support lists as lower and upper bound arguments to Box (#112)
add py.typed so that trieste type hints can be used by client code (#140)
add efficient astuple conversion method on Dataset (#106) (see the sketch after this list)
add support for optimizing all GPflow model wrappers with either tf.optimizers.Optimizer instances (with or without mini-batching) or gpflow.optimizers.Scipy (#47)
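The sketch below illustrates the list-valued Box bounds and the Dataset.astuple method mentioned above; it is a minimal example assuming the Dataset and Box classes in trieste.data and trieste.space.

```python
# Sketch only: class locations assumed, see note above.
import tensorflow as tf

from trieste.data import Dataset
from trieste.space import Box

# lower and upper bounds can now be given as plain lists
search_space = Box([0.0, 0.0], [1.0, 1.0])

query_points = tf.constant([[0.1, 0.2], [0.7, 0.4]], dtype=tf.float64)
observations = tf.constant([[0.3], [0.9]], dtype=tf.float64)
data = Dataset(query_points, observations)

# unpack the dataset back into its component tensors
qp, obs = data.astuple()
```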
Improvements
significant refactor of the BayesianOptimizer return type, to reduce the chance of working with the result of incomplete BO runs (#17) (see the sketch after this list)
merge equivalent tensor type aliases (those in the type module) (#76)
optimize deepcopying of the types typically copied while tracking state in BayesianOptimizer (#104)
fix type inconsistency in VariationalGaussianProcess's constructor (#116)
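A hedged sketch of the refactored result handling is below. The helper and method names (mk_observer in trieste.objectives.utils, try_get_final_dataset on the optimization result, the build_gpr helper) are assumed from later releases of the library and are not part of this release's notes.

```python
# Sketch only: names taken from later releases, see note above.
import tensorflow as tf

from trieste.bayesian_optimizer import BayesianOptimizer
from trieste.models.gpflow import GaussianProcessRegression, build_gpr
from trieste.objectives.utils import mk_observer
from trieste.space import Box


def objective(x: tf.Tensor) -> tf.Tensor:
    return tf.reduce_sum((x - 0.5) ** 2, axis=-1, keepdims=True)


search_space = Box([0.0, 0.0], [1.0, 1.0])
observer = mk_observer(objective)

initial_data = observer(search_space.sample(5))
model = GaussianProcessRegression(build_gpr(initial_data, search_space))

# optimize() returns a structured result rather than raw state, so an
# incomplete run surfaces as an explicit error instead of partial data
result = BayesianOptimizer(observer, search_space).optimize(10, initial_data, model)
final_data = result.try_get_final_dataset()  # raises if the run failed part-way
```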
Build changes
various improvements to documentation site, including "how-to" section in tutorials (#63) and formatting for bibtex references (#110)
add flake8 code linter (#109) and isort import organiser (#107) to build checks
add missing build dependencies to pyproject.toml (#141)