
Neural Network Experiments

A collection of code and notebooks implementing various types of neural networks in NumPy, mostly tested on MNIST and CIFAR-10. As expected, the code is not optimized and is incapable of utilizing GPU power!

Fundamentals of Neural Networks

  1. Softmax Classification on MNIST (a minimal sketch follows this list)

  2. Stochastic, Mini-Batch, and Batch Gradient Descent; Dataloaders

  3. Optimizers (Momentum, RMSProp, Adam)

  4. Regularization (L1, L2, Dropout, BatchNorm)

  5. Convolutional Neural Networks (CNNs) on MNIST

  6. Recurrent Neural Networks (RNNs) on MNIST

  7. Generative Adversarial Networks (GANs) on MNIST

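As a taste of items 1 and 2, here is a minimal sketch of a softmax classifier trained with mini-batch gradient descent in NumPy. It is not the repository's code: the names are illustrative, and the shapes assume flattened MNIST digits (784 features, 10 classes).

```python
import numpy as np

def softmax(z):
    # Subtract the row-wise max for numerical stability.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train_softmax(X, y, num_classes=10, lr=0.1, batch_size=64, epochs=5):
    # X: (N, D) float features, y: (N,) integer labels in [0, num_classes).
    N, D = X.shape
    W = 0.01 * np.random.randn(D, num_classes)
    b = np.zeros(num_classes)
    for _ in range(epochs):
        perm = np.random.permutation(N)      # reshuffle every epoch
        for i in range(0, N, batch_size):
            idx = perm[i:i + batch_size]
            Xb, yb = X[idx], y[idx]
            probs = softmax(Xb @ W + b)      # forward pass
            # Gradient of mean cross-entropy w.r.t. the logits:
            # (probs - one_hot(y)) / batch size.
            probs[np.arange(len(yb)), yb] -= 1.0
            probs /= len(yb)
            W -= lr * (Xb.T @ probs)         # gradient descent update
            b -= lr * probs.sum(axis=0)
    return W, b
```

Setting batch_size=1 recovers stochastic gradient descent, and batch_size=N recovers full-batch gradient descent, which is exactly the spectrum item 2 walks through.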

Building Your Own Deep Learning Framework

  1. Sequential Layers (a minimal sketch follows this list)

  2. Autograd

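To sketch what item 1 builds (again with illustrative names, not the repository's code): each layer exposes a forward and a backward method, and a Sequential container chains the forward passes and runs the backward passes in reverse order.

```python
import numpy as np

class Linear:
    def __init__(self, in_dim, out_dim, lr=0.1):
        self.W = 0.01 * np.random.randn(in_dim, out_dim)
        self.b = np.zeros(out_dim)
        self.lr = lr

    def forward(self, x):
        self.x = x                            # cache input for backward
        return x @ self.W + self.b

    def backward(self, grad):
        dx = grad @ self.W.T                  # gradient w.r.t. the input
        self.W -= self.lr * (self.x.T @ grad)
        self.b -= self.lr * grad.sum(axis=0)
        return dx

class ReLU:
    def forward(self, x):
        self.mask = x > 0
        return x * self.mask

    def backward(self, grad):
        return grad * self.mask

class Sequential:
    def __init__(self, *layers):
        self.layers = layers

    def forward(self, x):
        for layer in self.layers:
            x = layer.forward(x)
        return x

    def backward(self, grad):
        for layer in reversed(self.layers):
            grad = layer.backward(grad)
        return grad

# Example: a two-layer MLP for MNIST-sized inputs.
model = Sequential(Linear(784, 128), ReLU(), Linear(128, 10))
```

Autograd (item 2) generalizes this pattern: instead of hand-writing each backward method, operations are recorded as they execute, and the gradient is obtained by replaying that record in reverse.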

Applications of Neural Networks

  1. Linear and CNN Autoencoder on MNIST

  2. Sentiment Classification and Word Embeddings on the IMDB Movie Review Dataset

  3. Character-level RNN (a one-step sketch follows this list)

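The heart of item 3 is a single recurrence that maps a one-hot character and the previous hidden state to a new hidden state and a distribution over the next character. A one-step sketch (illustrative shapes and names, not the repository's code):

```python
import numpy as np

def rnn_step(x, h_prev, params):
    # x: (V,) one-hot character, h_prev: (H,) previous hidden state.
    Wxh, Whh, Why, bh, by = params
    h = np.tanh(Wxh @ x + Whh @ h_prev + bh)  # new hidden state
    logits = Why @ h + by                     # scores over the vocabulary
    p = np.exp(logits - logits.max())         # stable softmax
    return h, p / p.sum()                     # next-character distribution

# Illustrative sizes: a 65-character vocabulary, hidden size 100.
V, H = 65, 100
params = (0.01 * np.random.randn(H, V),       # Wxh: input -> hidden
          0.01 * np.random.randn(H, H),       # Whh: hidden -> hidden
          0.01 * np.random.randn(V, H),       # Why: hidden -> output
          np.zeros(H), np.zeros(V))           # bh, by
```

Training unrolls this step over a sequence and backpropagates through time; sampling draws a character from the returned distribution and feeds it back in as the next x.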

Classical Machine Learning

  1. Principal Component Analysis (PCA)
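
A compact way to implement PCA in NumPy, sketched below under the usual assumptions (rows are samples; features are centered before decomposing), is via the SVD of the centered data matrix:

```python
import numpy as np

def pca(X, k):
    # X: (N, D) data matrix; returns top-k scores and components.
    Xc = X - X.mean(axis=0)                   # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:k]                       # top-k principal directions
    scores = Xc @ components.T                # projections onto them
    explained = S[:k] ** 2 / (len(X) - 1)     # variance along each direction
    return scores, components, explained
```

The right singular vectors are the principal directions, and the squared singular values divided by N - 1 give the variance each one explains.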
