vboost

Code for the paper Variational Boosting: Iteratively Refining Posterior Approximations.

Abstract

We propose a black-box variational inference method to approximate intractable distributions with an increasingly rich approximating class. Our method, termed variational boosting, iteratively refines an existing variational approximation by solving a sequence of optimization problems, allowing the practitioner to trade computation time for accuracy. We show how to expand the variational approximating class by incorporating additional covariance structure and by introducing new components to form a mixture. We apply variational boosting to synthetic and real statistical models, and show that resulting posterior inferences compare favorably to existing posterior approximation algorithms in both accuracy and efficiency.
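The boosting loop is straightforward to prototype. The sketch below is a minimal illustration of one round under assumed names and hyperparameters, not the code in this repository: it grows a mixture of diagonal Gaussians one component at a time, estimating the ELBO by Monte Carlo with a plain MC term for the fixed old mixture and a reparameterized term for the new component. The toy target logp, the helper fit_new_component, and all settings are hypothetical stand-ins.

```python
# Minimal sketch of one variational-boosting round, assuming a mixture
# of diagonal Gaussians as the approximating family. All names and
# hyperparameters here are illustrative, not the repository's API.
import autograd.numpy as np
import autograd.numpy.random as npr
from autograd import grad

def logsumexp(a):
    # Numerically stable log-sum-exp using autograd-aware ops.
    m = np.max(a)
    return m + np.log(np.sum(np.exp(a - m)))

def logp(x):
    # Stand-in "intractable" target: a correlated 2-d Gaussian.
    prec = np.array([[1.5, -0.8], [-0.8, 1.5]])
    return -0.5 * np.dot(x, np.dot(prec, x))

def log_q(x, means, log_stds, log_ws):
    # Log density of a mixture of diagonal Gaussians.
    lps = np.array([-0.5 * np.sum(((x - m) / np.exp(ls)) ** 2)
                    - np.sum(ls) - 0.5 * x.size * np.log(2 * np.pi)
                    for m, ls in zip(means, log_stds)])
    return logsumexp(log_ws + lps)

def fit_new_component(means, log_stds, log_ws, dim=2, n_samps=50,
                      n_iters=300, step=0.02, seed=0):
    # Add one component: fit its mean, log-std, and mixing weight by
    # stochastic gradient ascent on a Monte Carlo ELBO estimate,
    # holding the existing components fixed.
    rs = npr.RandomState(seed)

    def unpack(params):
        m, ls, logit = params[:dim], params[dim:2 * dim], params[-1]
        rho = 1.0 / (1.0 + np.exp(-logit))  # weight of the new component
        new_log_ws = np.concatenate([log_ws + np.log(1.0 - rho),
                                     np.reshape(np.log(rho), (1,))])
        return means + [m], log_stds + [ls], new_log_ws, rho

    def elbo(params):
        ms, lss, lws, rho = unpack(params)
        total = 0.0
        for _ in range(n_samps):
            # Old-mixture term: plain Monte Carlo (old components fixed).
            k = rs.choice(len(means), p=np.exp(log_ws))
            x_old = means[k] + np.exp(log_stds[k]) * rs.randn(dim)
            # New-component term: reparameterization trick.
            x_new = ms[-1] + np.exp(lss[-1]) * rs.randn(dim)
            total = total + (1.0 - rho) * (logp(x_old) - log_q(x_old, ms, lss, lws)) \
                          + rho * (logp(x_new) - log_q(x_new, ms, lss, lws))
        return total / n_samps

    params = np.concatenate([rs.randn(dim), -1.0 * np.ones(dim),
                             np.array([-2.0])])  # start with a small weight
    elbo_grad = grad(elbo)
    for _ in range(n_iters):
        params = params + step * elbo_grad(params)
    return unpack(params)

# Start from a single broad Gaussian and boost for a few rounds.
means, log_stds, log_ws = [np.zeros(2)], [np.zeros(2)], np.array([0.0])
for r in range(3):
    means, log_stds, log_ws, _ = fit_new_component(means, log_stds, log_ws, seed=r)
```

Each round leaves the earlier components untouched and fits only the new component's mean, scale, and mixing weight, which is what lets extra rounds of computation buy a strictly richer posterior approximation.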

Authors: Andrew Miller, Nick Foti, and Ryan Adams.

Requires

  • autograd and its dependencies (numpy, etc.). Our code is compatible with this autograd commit or later; you can install the master version with pip install git+git://github.com/HIPS/autograd.git@master.
  • pyprind for progress bars.
  • sampyl, used for the MCMC experiments. A quick environment check is sketched below this list.
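The snippet below is a hypothetical sanity check, assuming the packages above import under their usual module names; it is not part of the repository.

```python
# Sanity check: the dependencies import and autograd differentiates.
import autograd.numpy as np
from autograd import grad
import pyprind
import sampyl

# d/dx sin(x)^2 = sin(2x); at x = 0.5 this is sin(1.0), about 0.8415.
print(grad(lambda x: np.sin(x) ** 2)(0.5))
```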

About

Code supplement for Variational Boosting (paper: https://arxiv.org/abs/1611.06585).
