
HYPPO

HYPer-Parameter Optimization: Bayesian optimization using Gaussian processes for scikit-learn.

HYPPO is a library for tuning the hyperparameters of machine learning models via Gaussian-process-based (GP-based) Bayesian optimization, built on top of the popular scikit-learn library.

How to use: Check out the file "slice_example.py" for an example.
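
To illustrate the underlying idea, here is a minimal, self-contained sketch of the sequential GP-based Bayesian optimization loop that a library like HYPPO automates. It is written against plain scikit-learn (the modern GaussianProcessRegressor API) rather than HYPPO's own functions; the SVM objective, the search range, and the expected-improvement loop are illustrative assumptions, so refer to slice_example.py for the actual API.

```python
# A sketch of sequential GP-based Bayesian optimization written against plain
# scikit-learn (GaussianProcessRegressor), NOT HYPPO's API. The SVM objective,
# search range, and expected-improvement loop are illustrative assumptions.
import numpy as np
from scipy.stats import norm
from sklearn.datasets import load_digits
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)

def objective(log_c):
    """Cross-validated accuracy of an SVM as a function of log10(C)."""
    return cross_val_score(SVC(C=10.0 ** log_c), X, y, cv=3).mean()

# Search space: log10(C) in [-3, 3], discretized for simplicity.
candidates = np.linspace(-3, 3, 200).reshape(-1, 1)

# Seed the optimizer with a few random evaluations.
rng = np.random.default_rng(0)
observed_x = rng.uniform(-3, 3, size=(3, 1))
observed_y = np.array([objective(x[0]) for x in observed_x])

for _ in range(10):  # sequential Bayesian optimization iterations
    # Fit a GP surrogate to the (hyperparameter, score) pairs seen so far.
    gp = GaussianProcessRegressor(alpha=1e-6, normalize_y=True)
    gp.fit(observed_x, observed_y)
    mu, sigma = gp.predict(candidates, return_std=True)

    # Expected improvement over the best score observed so far (maximization).
    best = observed_y.max()
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - best) / sigma
    ei = (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)

    # Evaluate the most promising candidate and append it to the data.
    next_x = candidates[np.argmax(ei)]
    observed_x = np.vstack([observed_x, next_x])
    observed_y = np.append(observed_y, objective(next_x[0]))

print("Best log10(C):", observed_x[np.argmax(observed_y)][0])
print("Best CV accuracy:", observed_y.max())
```

Each iteration fits a GP surrogate to the evaluations made so far, scores every candidate with an acquisition function, and evaluates the most promising candidate next; HYPPO wraps this sequential loop (and the plotting of the GP and acquisition function) around scikit-learn estimators.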

Current features:

  • Performs sequential Bayesian optimization using Gaussian processes (GPs)
  • Easy to adopt for current scikit-learn users
  • Plots the fitted GPs and acquisition functions
  • Implements the expected improvement (EI), probability of improvement (PI), and GP-UCB acquisition functions (see the sketch after this list)
  • Uses scikit-learn's Gaussian Processes module (which does not currently include a Matérn kernel)
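
The three acquisition functions trade off exploiting the current best observation against exploring uncertain regions of the search space. Below is a sketch of their standard closed-form definitions for a maximization problem, computed from a GP posterior mean and standard deviation; the function names and the xi/kappa parameters are illustrative, not HYPPO's exact signatures.

```python
# Textbook acquisition functions for maximization, computed from a GP posterior
# mean `mu` and standard deviation `sigma` at candidate points. The names and
# the xi/kappa parameters are illustrative, not HYPPO's exact API.
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, best, xi=0.01):
    """EI: expected amount by which a candidate improves on the incumbent `best`."""
    sigma = np.maximum(sigma, 1e-12)
    z = (mu - best - xi) / sigma
    return (mu - best - xi) * norm.cdf(z) + sigma * norm.pdf(z)

def probability_of_improvement(mu, sigma, best, xi=0.01):
    """PI: probability that a candidate beats the incumbent `best` by at least xi."""
    sigma = np.maximum(sigma, 1e-12)
    return norm.cdf((mu - best - xi) / sigma)

def gp_ucb(mu, sigma, kappa=2.0):
    """GP-UCB: optimistic upper confidence bound, mu + kappa * sigma."""
    return mu + kappa * sigma
```

Whichever acquisition function is used, the candidate with the highest acquisition value is the one evaluated next, as in the loop sketched above.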

Future additions:

  • Installing via pip! (coming soon)
  • Parallel Bayesian Optimization using slice sampling (coming soon)
  • A custom Gaussian process module (see the gaussian_process folder) including Matérn 3/2 and 5/2 kernels (see the kernel sketch after this list)
  • Implementation of Tree-structured Parzen Estimators (TPE)
  • Multi-task Bayesian optimization and input warping
  • Detailed wiki (coming soon)
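
The Matérn 3/2 and 5/2 covariance functions mentioned in the roadmap have simple closed forms. The sketch below gives the standard textbook definitions as a function of the distance r = |x − x′|; the function and parameter names are illustrative and are not taken from the gaussian_process folder.

```python
# Standard Matérn 3/2 and 5/2 covariance functions, written as a generic
# reference sketch (illustrative names, not HYPPO's gaussian_process module).
import numpy as np

def matern32(r, length_scale=1.0, variance=1.0):
    """Matérn nu=3/2 kernel evaluated at the distance r = |x - x'|."""
    s = np.sqrt(3.0) * np.abs(r) / length_scale
    return variance * (1.0 + s) * np.exp(-s)

def matern52(r, length_scale=1.0, variance=1.0):
    """Matérn nu=5/2 kernel evaluated at the distance r = |x - x'|."""
    s = np.sqrt(5.0) * np.abs(r) / length_scale
    return variance * (1.0 + s + s ** 2 / 3.0) * np.exp(-s)
```

Compared with the squared-exponential kernel, Matérn kernels produce less smooth sample paths, which is commonly argued to be a better model of real validation-loss surfaces.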

References:

To learn more about GPs and Bayesian optimization, watch the amazing lectures from Prof. Nando de Freitas on the topic.

Authors: Adarsh Jois, Shivam Verma (New York University)
