Merge: Add Continuous Optimization Controls to BotorchRecommender (#389)
BotorchRecommender now has the optional keywords `n_restarts` and
`n_raw_samples` which control continuous optimization behavior and can
be increased if the user desires. The defaults have also been slightly
increased, as the previous values seemed extremely low.
Scienfitz authored Oct 11, 2024
2 parents 4a92743 + 3f5f48f commit 573664b
Showing 3 changed files with 35 additions and 15 deletions.
6 changes: 5 additions & 1 deletion CHANGELOG.md
@@ -5,6 +5,10 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [Unreleased]
### Added
- `n_restarts` and `n_raw_samples` keywords to configure continuous optimization
behavior for `BotorchRecommender`

### Fixed
- Leftover attrs-decorated classes are garbage collected before the subclass tree is
traversed, avoiding sporadic serialization problems
@@ -23,7 +27,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
- Leftover attrs-decorated classes are garbage collected before the subclass tree is
traversed, avoiding sporadic serialization problems

### Deprecated
### Deprecations
- `ContinuousLinearEqualityConstraint` and `ContinuousLinearInequalityConstraint`
replaced by `ContinuousLinearConstraint` with the corresponding `operator` keyword

27 changes: 19 additions & 8 deletions baybe/recommenders/pure/bayesian/botorch.py
@@ -5,8 +5,9 @@
from typing import Any, ClassVar

import pandas as pd
from attr.converters import optional
from attrs import define, field
from attrs.converters import optional as optional_c
from attrs.validators import gt, instance_of

from baybe.acquisition.acqfs import qThompsonSampling
from baybe.exceptions import (
@@ -50,12 +51,12 @@ class BotorchRecommender(BayesianRecommender):
# Object variables
sequential_continuous: bool = field(default=False)
"""Flag defining whether to apply sequential greedy or batch optimization in
**continuous** search spaces. (In discrete/hybrid spaces, sequential greedy
optimization is applied automatically.)
**continuous** search spaces. In discrete/hybrid spaces, sequential greedy
optimization is applied automatically.
"""

hybrid_sampler: DiscreteSamplingMethod | None = field(
converter=optional(DiscreteSamplingMethod), default=None
converter=optional_c(DiscreteSamplingMethod), default=None
)
"""Strategy used for sampling the discrete subspace when performing hybrid search
space optimization."""
@@ -64,6 +65,16 @@ class BotorchRecommender(BayesianRecommender):
"""Percentage of discrete search space that is sampled when performing hybrid search
space optimization. Ignored when ``hybrid_sampler="None"``."""

n_restarts: int = field(validator=[instance_of(int), gt(0)], default=10)
"""Number of times gradient-based optimization is restarted from different initial
points. **Does not affect purely discrete optimization**.
"""

n_raw_samples: int = field(validator=[instance_of(int), gt(0)], default=64)
"""Number of raw samples drawn for the initialization heuristic in gradient-based
optimization. **Does not affect purely discrete optimization**.
"""

@sampling_percentage.validator
def _validate_percentage( # noqa: DOC101, DOC103
self, _: Any, value: float
@@ -168,8 +179,8 @@ def _recommend_continuous(
acq_function=self._botorch_acqf,
bounds=torch.from_numpy(subspace_continuous.comp_rep_bounds.values),
q=batch_size,
num_restarts=5, # TODO make choice for num_restarts
raw_samples=10, # TODO make choice for raw_samples
num_restarts=self.n_restarts,
raw_samples=self.n_raw_samples,
equality_constraints=[
c.to_botorch(subspace_continuous.parameters)
for c in subspace_continuous.constraints_lin_eq
@@ -252,8 +263,8 @@ def _recommend_hybrid(
acq_function=self._botorch_acqf,
bounds=torch.from_numpy(searchspace.comp_rep_bounds.values),
q=batch_size,
num_restarts=5, # TODO make choice for num_restarts
raw_samples=10, # TODO make choice for raw_samples
num_restarts=self.n_restarts,
raw_samples=self.n_raw_samples,
fixed_features_list=fixed_features_list,
equality_constraints=[
c.to_botorch(
17 changes: 11 additions & 6 deletions docs/userguide/recommenders.md
@@ -38,16 +38,21 @@ for various acquisition functions.
* The **[`BotorchRecommender`](baybe.recommenders.pure.bayesian.botorch.BotorchRecommender)**
is a powerful recommender based on BoTorch's optimization engine that can be applied
to all kinds of search spaces. In continuous spaces, its `sequential_continuous` flag
allows to chose between greedy sequential optimization and batch optimization as the
allows to choose between greedy sequential optimization and batch optimization as the
underlying point generation mode. In discrete/hybrid spaces, sequential greedy
selection is the only available mode and is thus activated automatically.

Note that the recommender performs a brute-force search when applied to hybrid search
spaces, as it optimizes the continuous part of the space while exhaustively searching
choices in the discrete subspace. You can customize this behavior to only sample a
certain percentage of the discrete subspace via the `sample_percentage` attribute and
to choose different sampling algorithms via the `hybrid_sampler` attribute.

spaces, as it does gradient-based optimization in the continuous part of the space
while exhaustively evaluating configurations of the discrete subspace. You can
customize this behavior to only sample a certain percentage of the discrete subspace
via the `sampling_percentage` attribute and to choose different sampling algorithms
via the `hybrid_sampler` attribute.

The gradient-based optimization can be further controlled via the `n_restarts`
and `n_raw_samples` keywords. For details, please refer to
[BotorchRecommender](baybe.recommenders.pure.bayesian.botorch.BotorchRecommender).

An example on using this recommender in a hybrid space can be found
[here](./../../examples/Backtesting/hybrid).

