Neural networks are powerful machine learning tools used mainly for prediction, classification, and generative applications.
Neural networks are universal function approximators.
They can be visualized as multiple logistic regression units drawing decision boundaries in hyperspace, as spatial transformations (i.e., representations of the data in various ways in hyperspace), or as probability distribution approximators.
This repo contains an implementation of a general neural network from scratch in pure Python and NumPy.
- Modular: separate neuron, layer, and nn classes, each with its own forward and backward functions
- General: NNs of any custom shape can be created
- Supports:
  - Activations: Linear, ReLU, Sigmoid, Softmax (why we need activations)
  - Loss: MSELoss, BCELoss, CrossEntropy/NLL Loss
  - Optimizers: SGD, Momentum SGD, RMSprop, Adam
- Supports visualization:
  - NN layers as spatial transformations (only layers with 2 or 3 neurons are supported for visualization)
  - Decision boundary
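To make the optimizer list concrete, here is a minimal NumPy sketch of the Adam update rule; the function and variable names are illustrative, not this repo's API:

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: exponential moving averages of the gradient and its
    square, with bias correction for the early steps (t starts at 1)."""
    m = beta1 * m + (1 - beta1) * grad        # first moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment (uncentered variance)
    m_hat = m / (1 - beta1 ** t)              # bias-corrected estimates
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v
```

Momentum SGD keeps only the first moment and RMSprop only the second; plain SGD keeps neither.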
-
- numpy
- matplotlib (for visualization, if required)
- sklearn (optional; required only to generate the spiral data in spiral.py)
- XOR data fitting and visualization: `python xor.py`
- Spiral data fitting and visualization: `python spiral.py`
-
To build a custom network, import the nn class:

    import nn.nn as nn

Define the neural network as:

    net = nn(shape=[in_features, hidden1, hidden2, ..., hiddenN, out_layer],
             activations=[act_fn for all hidden layers and output layer],
             viz=False)

Training loop:

    # input and targets should be of shape (Batch_size, n)
    for each epoch:
        for each iteration:
            net.zero_grad()
            output, loss = net(input_batch, target_batch)
            net.adam(lr=learning_rate)
For more, refer to xor.py.
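The loop above depends on this repo's nn module. As a self-contained illustration of the same structure (zero the gradients, forward pass, optimizer step), here is a tiny two-layer sigmoid network fitted to XOR in plain NumPy; every name below is illustrative, not the repo's API:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# 2 -> 4 -> 1 network with sigmoid activations
W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)
W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)
sig = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for epoch in range(5000):
    h = sig(X @ W1 + b1)                   # forward: hidden layer
    out = sig(h @ W2 + b2)                 # forward: output layer
    loss = np.mean((out - y) ** 2)         # MSE loss
    d_out = (out - y) * out * (1 - out)    # backward through MSE + sigmoid
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)   # SGD step
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0)
```

This is full-batch SGD; the repo's nn class additionally supports momentum, RMSprop, and Adam for the update step.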
To be added
Optimizers are currently implemented as member functions of the nn class. In the future, optimizers should be implemented as a separate class that takes the network's parameters as input.
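A sketch of that refactor (class and field names are hypothetical): the optimizer becomes its own class holding references to the parameter and gradient arrays, so new optimizers can be added without touching the nn class:

```python
import numpy as np

class SGD:
    """Standalone SGD optimizer: updates parameters it was handed references to,
    instead of living as a member function of the nn class."""

    def __init__(self, params, lr=0.01):
        # params: list of dicts, each holding 'value' and 'grad' NumPy arrays
        self.params = params
        self.lr = lr

    def zero_grad(self):
        for p in self.params:
            p["grad"].fill(0.0)

    def step(self):
        for p in self.params:
            p["value"] -= self.lr * p["grad"]
```

A training loop would then call `opt.zero_grad()` before backpropagation and `opt.step()` after it, in place of `net.zero_grad()` and `net.adam(lr=...)`.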