ahad-s/digit-recognition-ann

Uses the MNIST handwritten digit dataset with a standard 3-layer ANN, trained with forward/backward propagation (FP/BP) and conjugate gradient (CG) for cost minimization.

Standard ANN implementation using forward propagation and backpropagation with 1 hidden layer; the current training setting uses 300 hidden neurons. The classifier reaches >85% accuracy on all 2000 test examples when trained on 4000 examples.
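ann.py is the reference implementation; the sketch below only illustrates the general approach described above (a 784-300-10 network whose regularized cost and backpropagation gradient are handed to a conjugate gradient optimizer). The use of scipy.optimize.fmin_cg, the function names, and the random stand-in data are assumptions, not the repository's actual code.

```python
# Minimal sketch (not the repo's code): train a 784-300-10 network with
# conjugate gradient, using random data as a stand-in for MNIST.
import numpy as np
from scipy.optimize import fmin_cg

INPUT, HIDDEN, OUTPUT = 784, 300, 10
LAMBDA = 1.0  # regularization constant (name taken from the README)

def unroll(theta):
    """Split the flat parameter vector into the two weight matrices."""
    split = HIDDEN * (INPUT + 1)
    w1 = theta[:split].reshape(HIDDEN, INPUT + 1)
    w2 = theta[split:].reshape(OUTPUT, HIDDEN + 1)
    return w1, w2

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, Y):
    """Regularized cross-entropy cost via forward propagation."""
    w1, w2 = unroll(theta)
    m = X.shape[0]
    a1 = np.hstack([np.ones((m, 1)), X])                    # add bias unit
    a2 = np.hstack([np.ones((m, 1)), sigmoid(a1 @ w1.T)])   # hidden layer
    a3 = sigmoid(a2 @ w2.T)                                 # output layer
    J = -np.sum(Y * np.log(a3) + (1 - Y) * np.log(1 - a3)) / m
    J += LAMBDA / (2 * m) * (np.sum(w1[:, 1:] ** 2) + np.sum(w2[:, 1:] ** 2))
    return J

def grad(theta, X, Y):
    """Gradient of the cost via backpropagation."""
    w1, w2 = unroll(theta)
    m = X.shape[0]
    a1 = np.hstack([np.ones((m, 1)), X])
    z2 = a1 @ w1.T
    a2 = np.hstack([np.ones((m, 1)), sigmoid(z2)])
    a3 = sigmoid(a2 @ w2.T)
    d3 = a3 - Y
    d2 = (d3 @ w2)[:, 1:] * sigmoid(z2) * (1 - sigmoid(z2))
    g1 = d2.T @ a1 / m
    g2 = d3.T @ a2 / m
    g1[:, 1:] += LAMBDA / m * w1[:, 1:]     # regularize non-bias weights
    g2[:, 1:] += LAMBDA / m * w2[:, 1:]
    return np.concatenate([g1.ravel(), g2.ravel()])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.random((100, INPUT))                        # stand-in for MNIST images
    Y = np.eye(OUTPUT)[rng.integers(0, OUTPUT, 100)]    # one-hot labels
    theta0 = rng.normal(scale=0.1, size=HIDDEN * (INPUT + 1) + OUTPUT * (HIDDEN + 1))
    theta = fmin_cg(cost, theta0, fprime=grad, args=(X, Y), maxiter=5)
```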

Refer to the bottom of ann.py for how to change the initial parameters (regularization constant LAMBDA, number of neurons in the hidden layer, etc.).
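The exact names and defaults are whatever ann.py declares; purely as an assumed illustration, that parameter block might look something like:

```python
# Illustrative only -- the real names and values are at the bottom of ann.py.
LAMBDA = 1.0            # regularization constant (mentioned above)
HIDDEN_NEURONS = 300    # size of the single hidden layer (assumed name)
TRAIN_EXAMPLES = 4000   # out of the 6000 available (assumed name)
TEST_EXAMPLES = 2000    # (assumed name)
LOAD_FROM_FILE = True   # toggle discussed in the warning below
```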

Total training examples available: 6000
Total test examples available: 2000

WARNING: Training takes a while, and a good CPU is recommended for >1000 training examples with the current number of hidden neurons. If you want to use the pre_trained data instead, change LOAD_FROM_FILE to False at the bottom of ann.py.
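How the pre_trained data is stored is not shown in this README; the following is only a hedged sketch of the kind of save/load toggle such a flag could control, using NumPy persistence and a hypothetical trained_weights.npy file.

```python
# Sketch only: persist/restore a flat weight vector. The file name, helper
# function, and storage format are assumptions, not ann.py's actual code.
import os
import numpy as np

WEIGHTS_FILE = "trained_weights.npy"   # hypothetical file name

def get_weights(train_fn, initial_theta, load_from_file):
    """Either restore previously saved weights or train and save new ones."""
    if load_from_file and os.path.exists(WEIGHTS_FILE):
        return np.load(WEIGHTS_FILE)       # reuse pre-trained weights
    theta = train_fn(initial_theta)        # otherwise train from scratch
    np.save(WEIGHTS_FILE, theta)           # and cache the result
    return theta
```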
