zhangtyzzz/gan-vae-pretrained-pytorch

Pre-trained GANs, VAEs + classifiers for MNIST / CIFAR10

  • includes model class definitions + training scripts
  • includes notebooks showing how to load pretrained nets / use them
  • tested with pytorch 1.0, python 3
  • generates images the same size as the dataset images
  • based on the official pytorch examples repo, with modifications to generate images of the appropriate size

mnist

Generates images the size of the MNIST dataset (28x28), using an architecture based on the DCGAN paper. Trained for 100 epochs. Weights here.

[image grid: dcgan samples (fake_images-300), vae samples (fake_images-300), data samples (real_images)]
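As a rough sketch of how sampling from such a pretrained generator works, the snippet below defines a minimal DCGAN-style generator that upsamples a latent vector to 28x28 grayscale images. The class name, layer widths, and the commented weight path are assumptions for illustration, not the repo's exact definitions.

```python
# Hypothetical sketch of a DCGAN generator for 28x28 MNIST images.
# Layer sizes and the weight filename are assumptions, not the repo's exact code.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Upsamples a latent vector z to a 1x28x28 image via transposed convs."""
    def __init__(self, nz=100, ngf=32):
        super().__init__()
        self.main = nn.Sequential(
            # z (nz x 1 x 1) -> 4x4
            nn.ConvTranspose2d(nz, ngf * 4, 4, 1, 0, bias=False),
            nn.BatchNorm2d(ngf * 4),
            nn.ReLU(True),
            # 4x4 -> 7x7
            nn.ConvTranspose2d(ngf * 4, ngf * 2, 3, 2, 1, bias=False),
            nn.BatchNorm2d(ngf * 2),
            nn.ReLU(True),
            # 7x7 -> 14x14
            nn.ConvTranspose2d(ngf * 2, ngf, 4, 2, 1, bias=False),
            nn.BatchNorm2d(ngf),
            nn.ReLU(True),
            # 14x14 -> 28x28, tanh puts pixels in [-1, 1]
            nn.ConvTranspose2d(ngf, 1, 4, 2, 1, bias=False),
            nn.Tanh(),
        )

    def forward(self, z):
        return self.main(z)

netG = Generator()
# netG.load_state_dict(torch.load("mnist/weights/netG.pth"))  # path is an assumption
z = torch.randn(16, 100, 1, 1)
with torch.no_grad():
    fake = netG(z)
print(fake.shape)  # torch.Size([16, 1, 28, 28])
```

The key point is that the final transposed convolution is sized so the output matches the dataset resolution (28x28 for MNIST) rather than the 64x64 default of the original DCGAN code.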

For comparison with a less complicated architecture, I've also included a pre-trained non-convolutional GAN in the mnist_gan_mlp folder, based on code from this repo (trained for 300 epochs).

I've also included a pre-trained LeNet classifier which achieves 99% test accuracy in the classifiers/mnist folder, based on this repo.
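For reference, a LeNet-style classifier for 28x28 MNIST inputs can be sketched as below; the exact layer dimensions and the commented weight path are assumptions, not necessarily the repo's definition.

```python
# Hypothetical LeNet-style MNIST classifier sketch; layer sizes are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LeNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 6, 5, padding=2)  # 28x28 -> 28x28
        self.conv2 = nn.Conv2d(6, 16, 5)            # 14x14 -> 10x10
        self.fc1 = nn.Linear(16 * 5 * 5, 120)
        self.fc2 = nn.Linear(120, 84)
        self.fc3 = nn.Linear(84, 10)                # 10 digit classes

    def forward(self, x):
        x = F.max_pool2d(F.relu(self.conv1(x)), 2)  # -> 6 x 14 x 14
        x = F.max_pool2d(F.relu(self.conv2(x)), 2)  # -> 16 x 5 x 5
        x = x.flatten(1)
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        return self.fc3(x)

net = LeNet()
# net.load_state_dict(torch.load("classifiers/mnist/weights.pt"))  # path is an assumption
logits = net(torch.randn(4, 1, 28, 28))
print(logits.shape)  # torch.Size([4, 10])
```

A classifier like this is handy alongside the GAN/VAE, e.g. for scoring how recognizable generated digits are.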

cifar10

The cifar10 gan is from the pytorch examples repo and implements the DCGAN paper. It required only minor alterations to generate images the size of the cifar10 dataset (32x32x3). Trained for 200 epochs. Weights here.

[image grid: generated samples (fake_images-300), data samples (real_images)]

I've also linked to a pre-trained cifar classifier in the classifiers/cifar folder from this repo.

cifar100

Similar to the GANs above, the cifar100 GAN here generates 32x32x1 grayscale images. Trained for 200 epochs. Weights here. There are also weights/code for generating 34x45x1 images.

[image grid: generated samples (fake_images-300), data samples (real_images)]

reference

  • feel free to use/share this code openly
  • for similar projects, see some of my other repos (e.g. acd) or my website (csinva.github.io)

