fig.show()

# %% [markdown]
# ## Sample the observer over the search space
#
# Sometimes we don't have direct access to the objective function itself, only to an observer that evaluates it indirectly. In _Trieste_, an observer can output any number of datasets; in our case there is just one, containing the objective values. We can convert a function with `branin`'s signature into a single-output observer using `mk_observer`.
#
# The optimization procedure will benefit from having some starting data from the objective function to base its search on. We sample a five-point space-filling design from the search space and evaluate it with the observer. For continuous search spaces, Trieste supports random, Sobol and Halton initial designs.

# %%
import trieste

observer = trieste.objectives.utils.mk_observer(scaled_branin)

num_initial_points = 5
initial_query_points = search_space.sample_sobol(num_initial_points)
initial_data = observer(initial_query_points)
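
# %% [markdown]
# To make the observer idea concrete, here is a minimal sketch using a toy
# quadratic in place of `scaled_branin` (the toy function and query points are
# illustrative assumptions, not part of the tutorial's setup). `mk_observer`
# wraps a function into an observer that returns a `Dataset` pairing query
# points with their observations.

# %%
import tensorflow as tf


def _toy_objective(x: tf.Tensor) -> tf.Tensor:
    # Sum of squares, keeping a trailing output dimension of size one.
    return tf.reduce_sum(x**2, axis=-1, keepdims=True)


_toy_observer = trieste.objectives.utils.mk_observer(_toy_objective)
_toy_data = _toy_observer(tf.constant([[0.0, 1.0], [1.0, 1.0]], dtype=tf.float64))
print(_toy_data.query_points.shape, _toy_data.observations.shape)  # (2, 2) (2, 1)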

# %% [markdown]
# ## Model the objective function
#
# The Bayesian optimization procedure estimates the next best points to query using a probabilistic model of the objective. We'll use Gaussian process (GP) regression for this, as provided by GPflow. The model will need to be retrained at each step as more points are evaluated, so we'll package it with GPflow's Scipy optimizer.
#
# We put priors on the parameters of our GP model's kernel in order to stabilize model fitting. We found the priors below to be highly effective for objective functions defined over the unit hypercube and with an output standardized to have zero mean and unit variance. For objective functions with different scaling, other priors will likely be more appropriate. Our fitted model uses the maximum a posteriori estimate of these kernel parameters, found by optimizing the kernel parameters starting from the best of `num_kernel_samples` random samples drawn from the kernel parameter priors.
#
# If we do not specify kernel priors, then Trieste returns the maximum likelihood estimate of the kernel parameters.

# %%
import gpflow
import tensorflow_probability as tfp
from trieste.models.gpflow import GPflowModelConfig
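
# A minimal sketch of the model-building step, continuing this cell. The
# Matern 5/2 kernel, the log-normal priors, and the `GPflowModelConfig`
# fields below are assumptions following the pattern of Trieste tutorials of
# this vintage; exact prior constants and config keys may differ between
# Trieste releases.
import tensorflow as tf


def build_model(data: trieste.data.Dataset) -> GPflowModelConfig:
    # Initialize the kernel variance from the variance of the observed data.
    variance = tf.math.reduce_variance(data.observations)
    kernel = gpflow.kernels.Matern52(variance=variance, lengthscales=[0.2, 0.2])

    # Log-normal priors centred on the initial values stabilize the fit for
    # inputs on the unit hypercube and standardized outputs (prior widths here
    # are illustrative assumptions).
    prior_scale = tf.cast(1.0, dtype=tf.float64)
    kernel.variance.prior = tfp.distributions.LogNormal(
        tf.math.log(kernel.variance), prior_scale
    )
    kernel.lengthscales.prior = tfp.distributions.LogNormal(
        tf.math.log(kernel.lengthscales), prior_scale
    )

    gpr = gpflow.models.GPR(data.astuple(), kernel, noise_variance=1e-5)
    gpflow.set_trainable(gpr.likelihood, False)  # noiseless objective: fix the noise

    # `num_kernel_samples` random draws from the priors seed the MAP optimization.
    return GPflowModelConfig(
        model=gpr,
        model_args={"num_kernel_samples": 100},
    )


# model = build_model(initial_data)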