This repo contains implementations of several recent deep generative models using Chainer, a Python deep learning framework. The code was developed to support the dissertation laid out in dissertation.pdf.
- Maaløe, L., Sønderby, C.K., Sønderby, S.K. and Winther, O., 2016. Auxiliary deep generative models. arXiv preprint arXiv:1602.05473.
- Tomczak, J.M. and Welling, M., 2016. Improving Variational Auto-Encoders using Householder Flow. arXiv preprint arXiv:1611.09630.
- Burda, Y., Grosse, R. and Salakhutdinov, R., 2015. Importance weighted autoencoders. arXiv preprint arXiv:1509.00519.
- Kingma, D.P., Salimans, T. and Welling, M., 2016. Improving variational inference with inverse autoregressive flow. arXiv preprint arXiv:1606.04934.
- Rezende, D.J. and Mohamed, S., 2015. Variational inference with normalizing flows. arXiv preprint arXiv:1505.05770.
- Kingma, D.P., and Welling, M., 2013. Auto-encoding variational Bayes. arXiv preprint arXiv:1312.6114.
- Rezende, D.J., Mohamed, S., and Wierstra, D., 2014. Stochastic backpropagation and approximate inference in deep generative models. arXiv preprint arXiv:1401.4082.
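The common ingredient of the VAE-family models cited above is the reparameterization trick of Kingma & Welling (2013) together with an analytic KL term for a diagonal-Gaussian posterior. The following is a minimal NumPy sketch of those two pieces, for orientation only; it is not the repo's Chainer code, and all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, log_var, rng):
    """Sample z = mu + sigma * eps, eps ~ N(0, I), so gradients flow through mu and log_var."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def kl_to_standard_normal(mu, log_var):
    """Analytic KL(q(z|x) || N(0, I)) for a diagonal-Gaussian q, one value per example."""
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var, axis=-1)

# Toy posterior parameters: batch of 4 examples, latent dimension 16.
mu = rng.standard_normal((4, 16))
log_var = rng.standard_normal((4, 16))

z = reparameterize(mu, log_var, rng)
kl = kl_to_standard_normal(mu, log_var)
print(z.shape, kl.shape)  # (4, 16) (4,)
```

The ELBO is then the expected reconstruction log-likelihood minus this KL term; the flow-based papers above (Rezende & Mohamed; Kingma et al.; Tomczak & Welling) enrich q beyond the diagonal Gaussian used here.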
The data used for the experiments is not included in this repo at this time, but can be made available upon request.
The following Python packages are required (install each with `pip install --user <package>`):
- pyyaml
- h5py
- scipy
- docopt
- chainer
Example training run (GPU id 0):

```
./train.py -g 0 -o 'demo_model' --model-type vae --vae-samples 1 --ntrans 1 --nlatent 16 --nhidden 512 --nlayers 4 \
  -b 16384 --batch-limit 1000 -t 1000000 --time-print 600 --epoch-sample 100 --log-interval 5 --data pose \
  --init-temp 0 --temp-epoch 200 --init-learn 1e-4 --learn-decay 3e-3 --weight-decay 0 --init-model none
```
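The `--vae-samples` flag presumably sets the number of Monte Carlo samples per datapoint; with K > 1 samples one can form the importance-weighted bound of Burda et al. (2015), which is never looser than the standard ELBO. A small NumPy/SciPy sketch of that bound (illustrative names, not the repo's implementation):

```python
import numpy as np
from scipy.special import logsumexp

def iwae_bound(log_w):
    """Importance-weighted bound: log (1/K) sum_k w_k, averaged over the batch.

    log_w has shape (K, batch), holding log p(x, z_k) - log q(z_k | x)
    for K posterior samples per example.
    """
    K = log_w.shape[0]
    return np.mean(logsumexp(log_w, axis=0) - np.log(K))

rng = np.random.default_rng(0)
log_w = rng.standard_normal((8, 4))  # toy log-weights: K=8 samples, batch of 4
elbo_estimate = np.mean(log_w)       # the standard (K=1-style) ELBO estimate
print(iwae_bound(log_w) >= elbo_estimate)  # True: log-mean-exp >= mean, by Jensen
```

With K = 1 the two estimates coincide, which is why `--vae-samples 1` in the example run above recovers the plain VAE objective.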