BBopt

BBopt aims to provide the easiest hyperparameter optimization you'll ever do. Think of BBopt like Keras for black box optimization: one universal interface for working with any black box optimization backend.

BBopt's features include:

  • a universal API for defining your tunable parameters based on the standard library random module (so you don't even have to learn anything new!),
  • tons of state-of-the-art black box optimization algorithms such as Gaussian Processes from scikit-optimize or Tree Structured Parzen Estimation from hyperopt for tuning parameters,
  • the ability to switch algorithms (even across different backends!) while retaining all previous trials,
  • multiprocessing-safe data saving to enable running multiple trials in parallel,
  • support for optimizing over conditional parameters that only appear during some runs,
  • support for all major Python versions (2.7 or 3.4+), and
  • a straightforward interface for extending BBopt with your own custom algorithms.

Once you've defined your parameters, training a black box optimization model on those parameters is as simple as

bbopt your_file.py

and serving your file with optimized parameters is as simple as

import your_file

Installation

To get going with BBopt, just install it with

pip install bbopt

or, to also install the extra dependencies necessary for running BBopt's examples, run pip install bbopt[examples].

Basic Usage

To use bbopt, just add

# BBopt setup:
from bbopt import BlackBoxOptimizer
bb = BlackBoxOptimizer(file=__file__)
if __name__ == "__main__":
    bb.run()

to the top of your file, then call

x = bb.uniform("x", 0, 1)

for each of the tunable parameters in your model, and finally add

bb.maximize(y)      or      bb.minimize(y)

to set the value being optimized. Then, run

bbopt <your file here> -n <number of trials> -j <number of processes>

to train your model, and just

import <your module here>

to serve it!
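Putting it all together, a complete script might look like the following sketch (the file name, parameter, and toy objective are illustrative, not taken from the examples below):

# sketch_example.py -- a minimal, self-contained BBopt script (illustrative)
from bbopt import BlackBoxOptimizer
bb = BlackBoxOptimizer(file=__file__)
if __name__ == "__main__":
    bb.run()

# one tunable parameter drawn uniformly from [0, 1]
x = bb.uniform("x", 0, 1)

# a toy objective with its maximum at x = 0.5
y = -(x - 0.5)**2
bb.maximize(y)

if __name__ == "__main__":
    print("x =", x, "y =", y)

Running bbopt sketch_example.py -n 20 would then perform 20 trials, after which import sketch_example serves the best x found so far.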

Examples

Some examples of BBopt in action:

  • random_example.py: Extremely basic example using the random backend.
  • skopt_example.py: Slightly more complex example making use of the gaussian_process algorithm from the scikit-optimize backend.
  • hyperopt_example.py: Example showcasing the tree_structured_parzen_estimator algorithm from the hyperopt backend.
  • numpy_example.py: Example which showcases how to have numpy array parameters.
  • conditional_skopt_example.py: Example of having black box parameters that are dependent on other black box parameters using the gaussian_process algorithm from the scikit-optimize backend.
  • conditional_hyperopt_example.py: Example of doing conditional parameters with the tree_structured_parzen_estimator algorithm from the hyperopt backend.
  • keras_example.py: Complete example of using BBopt to optimize a neural network built with Keras. Uses the full API to implement its own optimization loop and thus avoid the overhead of running the entire file multiple times.
  • mixture_example.py: Example of using the mixture backend to randomly switch between different algorithms.
  • json_example.py: Example of using json instead of pickle to save parameters.

Full API

  1. Command-Line Interface
  2. Black Box Optimization Methods
    1. Constructor
    2. run
    3. algs
    4. run_backend
    5. minimize
    6. maximize
    7. remember
    8. get_current_run
    9. get_optimal_run
    10. get_data
    11. data_file
    12. backend
  3. Parameter Definition Methods
    1. randrange
    2. randint
    3. getrandbits
    4. choice
    5. randbool
    6. shuffle
    7. sample
    8. random
    9. uniform
    10. loguniform
    11. normalvariate
    12. lognormvariate
    13. rand
    14. randn
  4. Writing Your Own Backend

Command-Line Interface

The bbopt command is extremely simple in terms of what it actually does. For the command bbopt <file> -n <trials> -j <processes>, BBopt simply runs python <file> a number of times equal to <trials>, split across <processes> different processes.
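For example (with my_model.py standing in for your own script), bbopt my_model.py -n 20 -j 4 runs python my_model.py 20 times, split across 4 processes.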

Why does this work? If you're using the basic boilerplate, then running python <file> will trigger the if __name__ == "__main__": clause, which will run a training episode. But when you go to import your file, the if __name__ == "__main__": clause won't get triggered, and you'll just get served the best parameters found so far.

Since the command-line interface is so simple, advanced users who want to use the full API instead of the boilerplate need not use the bbopt command at all. If you want more information on the bbopt command, just run bbopt -h.

Black Box Optimization Methods

Constructor

BlackBoxOptimizer(file, protocol=None)

Create a new bb object; this should be done at the beginning of your program as all the other functions are methods of this object.

file is used by BBopt to figure out where to load and save data to, and should usually just be set to __file__ (BBopt uses os.path.splitext(file)[0] as the base path for the data file).

protocol determines how BBopt serializes data. If None (the default), BBopt will use pickle protocol 2, which is the highest version that works on both Python 2 and Python 3 (unless a json file is present, in which case BBopt will use json). To use the newest protocol instead, pass protocol=-1. If protocol="json", BBopt will use json instead of pickle, which is occasionally useful for cross-platform compatibility.
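As a quick sketch, constructing an optimizer that saves its data as json rather than pickle looks like:

from bbopt import BlackBoxOptimizer
bb = BlackBoxOptimizer(file=__file__, protocol="json")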

run

BlackBoxOptimizer.run(alg="tree_structured_parzen_estimator")

Start optimizing using the given black box optimization algorithm. Use algs to get the valid values for alg.

If this method is never called, or called with alg=None, BBopt will just serve the best parameters found so far, which is how the basic boilerplate works. Note that if no saved parameter data is found and a guess is present, BBopt will serve the guess, which is a good way of distributing your parameter values without shipping all your saved parameter data.

algs

BlackBoxOptimizer.algs

A dictionary mapping each valid algorithm for use in run to a (backend, kwargs) pair specifying the backend that implements the algorithm and the arguments passed to that backend.

Supported algorithms are:

  • "serving" (or None) (serving backend),
  • "random" (random backend),
  • "tree_structured_parzen_estimator" (hyperopt backend) (the default),
  • "annealing" (hyperopt backend),
  • "gaussian_process" (scikit-optimize backend),
  • "random_forest" (scikit-optimize backend),
  • "extra_trees" (scikit-optimize backend), and
  • "gradient_boosted_regression_trees" (scikit-optimize backend).

run_backend

BlackBoxOptimizer.run_backend(backend, *args, **kwargs)

The base function behind run. Instead of specifying an algorithm, run_backend lets you specify the specific backend you want to call and the parameters you want to call it with. Different backends do different things with the remaining arguments:

  • scikit-optimize passes the arguments to skopt.Optimizer,
  • hyperopt passes the arguments to fmin, and
  • mixture expects a distribution argument to specify the mixture of different algorithms to use, specifically a list of (alg, weight) tuples.
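For instance, here is a hedged sketch of calling the mixture backend directly (the algorithm names come from algs above; the equal weights are arbitrary):

# each trial randomly picks one of the listed algorithms,
# weighted by the given weights
bb.run_backend("mixture", distribution=[
    ("tree_structured_parzen_estimator", 1),
    ("random", 1),
])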

minimize

BlackBoxOptimizer.minimize(value)

Finish optimizing and set the loss for this run to value. To start another run, call run again.

maximize

BlackBoxOptimizer.maximize(value)

Same as minimize but sets the gain instead of the loss.

remember

BlackBoxOptimizer.remember(info)

Update the current run's "memo" field with the given info dictionary. Useful for saving information about a run that shouldn't actually impact optimization but that you would like to have access to later (using get_optimal_run, for example).
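For example (the memo fields here are hypothetical):

# attach bookkeeping data to the current run; it won't affect optimization
bb.remember({"train_seconds": 42.0, "notes": "baseline run"})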

get_current_run

BlackBoxOptimizer.get_current_run()

Get information on the current run, including the values of all parameters encountered so far and the loss/gain of the run, if it has been set yet.

get_optimal_run

BlackBoxOptimizer.get_optimal_run()

Get information on the best run so far. These are the parameters that will be used if run is not called.
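A sketch of inspecting the result after training (the exact contents of the returned dictionary depend on your runs; call get_data to see the full structure):

best = bb.get_optimal_run()
print(best)  # the best run's parameter values, loss/gain, and memo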

get_data

BlackBoxOptimizer.get_data()

Dump a dictionary containing all the information on your program collected by BBopt.

data_file

BlackBoxOptimizer.data_file

The path of the file where BBopt is saving data.

backend

BlackBoxOptimizer.backend

The backend object being used by the current BlackBoxOptimizer instance.

Parameter Definition Methods

Every BBopt parameter definition method has the form

bb.<random function>(<name>, <args>, **kwargs)

where

  • the method itself specifies what distribution is being modeled,
  • the first argument is always name, a unique string identifying that parameter,
  • following name are whatever arguments are needed to specify the distribution's parameters, and
  • at the end are keyword arguments, which are the same for all the different methods. The supported kwargs are:
    • guess, which specifies the initial value for the parameter, and
    • placeholder_when_missing, which specifies what placeholder value a conditional parameter should be given if missing.

Important note: Once you bind a name to a parameter you cannot change that parameter's options. Thus, if the options defining your parameters can vary from run to run, you must use a different name for each possible combination.
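A hedged sketch combining the shared kwargs with the naming rule above (all the names and ranges here are hypothetical):

# guess seeds a parameter with an initial value to serve before any data exists
lr = bb.loguniform("lr", 1e-5, 1e-1, guess=1e-3)

# a conditional parameter: it only appears on runs where use_dropout is True,
# so placeholder_when_missing tells BBopt what to record for the other runs
use_dropout = bb.randbool("use_dropout")
if use_dropout:
    dropout = bb.uniform("dropout_rate", 0.1, 0.9, placeholder_when_missing=0)

# since a name's options can't change, variants with different ranges
# must each get their own name
if use_dropout:
    width = bb.randint("width_with_dropout", 64, 256)
else:
    width = bb.randint("width_no_dropout", 16, 64)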

randrange

BlackBoxOptimizer.randrange(name, stop, **kwargs)

BlackBoxOptimizer.randrange(name, start, stop, step=1, **kwargs)

Create a new parameter modeled by random.randrange(start, stop, step), which is equivalent to random.choice(range(start, stop, step)), but can be much more efficient.

Backends which support randrange: scikit-optimize, hyperopt, random.

randint

BlackBoxOptimizer.randint(name, a, b, **kwargs)

Create a new parameter modeled by random.randint(a, b), which is equivalent to random.randrange(a, b+1).

Backends which support randint: scikit-optimize, hyperopt, random.

getrandbits

BlackBoxOptimizer.getrandbits(name, k, **kwargs)

Create a new parameter modeled by random.getrandbits(k), which is equivalent to random.randrange(0, 2**k).

Backends which support getrandbits: scikit-optimize, hyperopt, random.

choice

BlackBoxOptimizer.choice(name, seq, **kwargs)

Create a new parameter modeled by random.choice(seq), which chooses an element from seq.

Backends which support choice: scikit-optimize, hyperopt, random.

randbool

BlackBoxOptimizer.randbool(name, **kwargs)

Create a new boolean parameter, modeled by the equivalent of random.choice([True, False]).

Backends which support randbool: scikit-optimize, hyperopt, random.

shuffle

BlackBoxOptimizer.shuffle(name, x, **kwargs)

Create a new parameter modeled by random.shuffle(x), except that it returns the shuffled list instead of shuffling it in place.

Backends which support shuffle: scikit-optimize, hyperopt, random.

sample

BlackBoxOptimizer.sample(name, population, k, **kwargs)

Create a new parameter modeled by random.sample(population, k), which chooses k elements from population.

Backends which support sample: scikit-optimize, hyperopt, random.

random

BlackBoxOptimizer.random(name, **kwargs)

Create a new parameter modeled by random.random(), which is equivalent to random.uniform(0, 1).

Backends which support random: scikit-optimize, hyperopt, random.

uniform

BlackBoxOptimizer.uniform(name, a, b, **kwargs)

Create a new parameter modeled by random.uniform(a, b), which uniformly selects a float between a and b.

Backends which support uniform: scikit-optimize, hyperopt, random.

loguniform

BlackBoxOptimizer.loguniform(name, min_val, max_val, **kwargs)

Create a new parameter modeled by

math.exp(random.uniform(math.log(min_val), math.log(max_val)))

which logarithmically selects a float between min_val and max_val.

Backends which support loguniform: scikit-optimize, hyperopt, random.
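loguniform is a natural fit for scale-free parameters; a hypothetical sketch:

# search a regularization strength uniformly in log space
l2_penalty = bb.loguniform("l2_penalty", 1e-6, 1e-2)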

normalvariate

BlackBoxOptimizer.normalvariate(name, mu, sigma, **kwargs)

Create a new parameter modeled by random.normalvariate(mu, sigma).

Backends which support normalvariate: hyperopt, random.

lognormvariate

BlackBoxOptimizer.lognormvariate(name, mu, sigma, **kwargs)

Create a new parameter modeled by random.lognormvariate(mu, sigma) such that the natural log is a normal distribution with mean mu and standard deviation sigma.

Backends which support lognormvariate: hyperopt, random.

rand

BlackBoxOptimizer.rand(name, *shape, **kwargs)

Create a new parameter modeled by numpy.random.rand(*shape), which creates a numpy array of the given shape with entries generated uniformly in [0, 1).

Backends which support rand: scikit-optimize, hyperopt, random.
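For example, a hypothetical array-valued parameter:

# a 2x3 numpy array parameter with entries in [0, 1)
init_weights = bb.rand("init_weights", 2, 3)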

randn

BlackBoxOptimizer.randn(name, *shape, **kwargs)

Create a new parameter modeled by numpy.random.randn(*shape), which creates a numpy array of the given shape with entries generated according to a standard normal distribution.

Backends which support randn: hyperopt, random.

Writing Your Own Backend

BBopt's backend system is built to be extremely extensible, allowing anyone to write and register their own BBopt backends. The basic template for writing a BBopt backend is as follows:

from bbopt.backends.util import Backend

class MyBackend(Backend):
    backend_name = "my-backend"
    implemented_funcs = [
        ...,  # list the random functions you support here
    ]

    def __init__(self, examples, params, **options):
        self.init_fallback_backend()

        # the values you want to use for this run as a dict;
        #  you can use params to get the args for each param
        #  and examples to get all the past data (to see what
        #  examples and params look like, use bb.get_data)
        self.current_values = ...

MyBackend.register()
MyBackend.register_alg("my_alg")

Once you've written a BBopt backend as above, you simply need to import it to trigger the register calls and enable it to be used in BBopt. For some example BBopt backends, see BBopt's default backends (written in Coconut).
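Once registered, a custom backend can be selected like any built-in one. A sketch assuming the template above lives in a (hypothetical) module my_backend_module:

import my_backend_module  # triggers MyBackend.register() and register_alg

bb.run_backend("my-backend")  # call the backend directly with its kwargs
bb.run(alg="my_alg")          # or use the registered algorithm name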
