MO-DAGAN: Minority Oversampling using Data Augmented GANs

Abstract: Class imbalance is a common problem that reduces the performance of classification models. One typical solution is to oversample the minority class. However, classical oversampling techniques such as SMOTE or ADASYN are ill-suited for deep learning approaches since they work in feature space. Recently, Generative Adversarial Networks (GANs) have been successfully used to generate artificial training data to re-balance datasets. Nevertheless, these approaches are data-hungry, and it remains a challenge to train GANs on the limited data of the minority class. In this work, we plan to leverage recent advances in data-efficient GAN training to advance the state of the art in oversampling approaches.
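
The oversampling step described above can be illustrated with a short sketch: once a GAN has been trained on the minority class, synthetic samples are drawn from its generator and appended to the real training set until the classes are balanced. This is a minimal PyTorch illustration of that idea only; the generator interface, latent size, and tensor layout are assumptions, not this repository's code.

# Minimal sketch: top up the minority class with GAN samples so that it matches
# the largest class. `generator` and `latent_dim` are illustrative assumptions.
import torch
from torch.utils.data import TensorDataset, ConcatDataset

def rebalance_with_gan(real_images, real_labels, generator, latent_dim, minority_class, device="cpu"):
    counts = torch.bincount(real_labels)                      # samples per class
    n_missing = int(counts.max()) - int(counts[minority_class])
    if n_missing <= 0:
        return TensorDataset(real_images, real_labels)        # already balanced
    with torch.no_grad():
        z = torch.randn(n_missing, latent_dim, device=device)
        fake_images = generator(z).cpu()                      # synthetic minority images
    fake_labels = torch.full((n_missing,), minority_class, dtype=real_labels.dtype)
    return ConcatDataset([
        TensorDataset(real_images, real_labels),
        TensorDataset(fake_images, fake_labels),
    ])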

Results by dataset and imbalance ratio (IR):

| Dataset                     | MNIST          | Fashion-MNIST  | CIFAR10        | SVHN           |
| IR                          | 10 | 50 | 100  | 10 | 50 | 100  | 10 | 50 | 100  | 10 | 50 | 100  |
| EfficientNet                | -  | -  | -    | -  | -  | -    | -  | -  | -    | -  | -  | -    |
| EfficientNet + Oversampling | -  | -  | -    | -  | -  | -    | -  | -  | -    | -  | -  | -    |
| EfficientNet + WGAN         | -  | -  | -    | -  | -  | -    | -  | -  | -    | -  | -  | -    |
| EfficientNet + WGAN + ADA   | -  | -  | -    | -  | -  | -    | -  | -  | -    | -  | -  | -    |
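
The imbalance ratio (IR) in the table above is the ratio between the size of the largest class and the size of the minority class. Below is a minimal sketch, assuming torchvision and a single subsampled minority class, of how an imbalanced MNIST split at a given IR could be built; the exact protocol used for these experiments (which classes are reduced, step vs. long-tailed imbalance) is an assumption here.

# Sketch: subsample one MNIST class so the largest class is `ir` times bigger.
import torch
from torch.utils.data import Subset
from torchvision import datasets, transforms

def make_imbalanced_mnist(root, ir=100, minority_class=0, seed=0):
    ds = datasets.MNIST(root, train=True, download=True, transform=transforms.ToTensor())
    targets = ds.targets                                   # int64 tensor of labels
    counts = torch.bincount(targets)                       # samples per class
    keep = max(int(counts.max()) // ir, 1)                 # minority size for the requested IR
    gen = torch.Generator().manual_seed(seed)
    minority_idx = torch.nonzero(targets == minority_class, as_tuple=True)[0]
    keep_idx = minority_idx[torch.randperm(len(minority_idx), generator=gen)][:keep]
    other_idx = torch.nonzero(targets != minority_class, as_tuple=True)[0]
    return Subset(ds, torch.cat([other_idx, keep_idx]).tolist())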

Getting started

Install the required dependencies:

pip install -r requirements.txt

Run the following to train the GAN:

python main.py --config_path=configs/gan.yaml

Run the following to train the classification model:

python main.py --config_path=configs/classification.yaml
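
Both commands point main.py at a YAML config. As a rough, hypothetical sketch only (the actual entry point and the keys inside configs/*.yaml may differ), a config-driven launcher along these lines could look like:

# Hypothetical sketch of a YAML-driven entry point; not this repository's main.py.
import argparse
import yaml  # PyYAML

def main():
    parser = argparse.ArgumentParser()
    parser.add_argument("--config_path", required=True, help="Path to a YAML config")
    args = parser.parse_args()
    with open(args.config_path) as f:
        config = yaml.safe_load(f)       # e.g. model, dataset, and training settings
    task = config.get("task", "gan")     # assumed key selecting GAN vs. classifier training
    if task == "gan":
        print("Would train the GAN with:", config)
    else:
        print("Would train the classifier with:", config)

if __name__ == "__main__":
    main()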

Refs

  • Training Generative Adversarial Networks with Limited Data [paper][code]
  • EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks [paper][code]
