This is a working implementation of a vectorized fully connected neural network in NumPy.

The backpropagation algorithm is implemented in a fully vectorized fashion over a given minibatch: every forward and backward quantity is computed for the whole batch at once using matrix operations. This lets the code take advantage of NumPy's optimized built-in routines (and avoid clumsy nested loops!), which substantially improves training speed.
The backpropagation code lives in the `take_gradient_step_on_minibatch` method of the `NeuralNetwork` class (see `src/neural_network.py`). Refer to the in-code documentation and comments for a description of how the code works.
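To illustrate the idea, here is a minimal sketch of one fully vectorized gradient step over a minibatch for a two-layer sigmoid network with a squared-error loss. The function name and signature below are illustrative only, not the repository's actual API (see `src/neural_network.py` for the real implementation):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_step(X, Y, W1, b1, W2, b2, lr=0.1):
    """One vectorized forward/backward pass over a minibatch.

    X: (batch, n_in) inputs; Y: (batch, n_out) one-hot targets.
    """
    m = X.shape[0]
    # Forward pass: all examples are processed at once via matrix products.
    Z1 = X @ W1 + b1          # (batch, n_hidden)
    A1 = sigmoid(Z1)
    Z2 = A1 @ W2 + b2         # (batch, n_out)
    A2 = sigmoid(Z2)

    # Backward pass (mean squared-error loss, for simplicity).
    dZ2 = (A2 - Y) * A2 * (1 - A2)      # (batch, n_out)
    dW2 = A1.T @ dZ2 / m                # (n_hidden, n_out)
    db2 = dZ2.mean(axis=0)
    dZ1 = (dZ2 @ W2.T) * A1 * (1 - A1)  # (batch, n_hidden)
    dW1 = X.T @ dZ1 / m
    db1 = dZ1.mean(axis=0)

    # Gradient-descent update.
    return W1 - lr * dW1, b1 - lr * db1, W2 - lr * dW2, b2 - lr * db2
```

Note that no loop over examples appears anywhere: the batch dimension rides along through every matrix product, which is exactly what makes the NumPy implementation fast.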
## Repository structure
- The `src/` directory contains the implementation of the neural network:
  - `src/neural_network.py` contains the actual implementation of the `NeuralNetwork` class (including the vectorized backpropagation code)
  - `src/activations.py` and `src/losses.py` contain implementations of activation functions and losses, respectively
  - `src/utils.py` contains code to display a confusion matrix
- `main.py` contains driver code that trains an example neural network configuration using the `NeuralNetwork` class
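For a flavor of what vectorized activation and loss functions look like, here is a short sketch in the spirit of `src/activations.py` and `src/losses.py`. The exact names and signatures in the repository may differ; these are common numerically stable formulations, not the project's verbatim code:

```python
import numpy as np

def sigmoid(z):
    # Numerically stable sigmoid: avoid overflow in exp() for large |z|.
    out = np.empty_like(z, dtype=float)
    pos = z >= 0
    out[pos] = 1.0 / (1.0 + np.exp(-z[pos]))
    ez = np.exp(z[~pos])
    out[~pos] = ez / (1.0 + ez)
    return out

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)

def softmax(z):
    # Shift by the row-wise max for numerical stability.
    shifted = z - z.max(axis=1, keepdims=True)
    e = np.exp(shifted)
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(probs, targets_one_hot, eps=1e-12):
    # Mean negative log-likelihood over the minibatch.
    return -np.mean(np.sum(targets_one_hot * np.log(probs + eps), axis=1))
```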
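The confusion matrix mentioned for `src/utils.py` can itself be computed without loops. The helper below is illustrative of the general technique (the actual function in the repository may be organized differently, e.g. with plotting attached):

```python
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes):
    """Rows index true labels, columns index predicted labels."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    # Vectorized accumulation: counts repeated (true, pred) pairs correctly.
    np.add.at(cm, (y_true, y_pred), 1)
    return cm
```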
## Installation
To download the MNIST data, install python-mnist by cloning its repository and running its download script. Make sure the `python-mnist` directory sits inside the root directory of this project.
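For example (assuming python-mnist refers to the package at github.com/sorki/python-mnist; the repository URL and the exact name and location of the download script may differ, so check that project's README):

```shell
# Clone python-mnist into this project's root directory.
git clone https://github.com/sorki/python-mnist
cd python-mnist
# Run the bundled script to fetch the MNIST files (name may vary).
./bin/mnist_get_data.sh
cd ..
```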
## Contents
- Implement the `NeuralNetwork` class
- Implement common activation and loss functions
- Test the implementation on MNIST data
## Results
One hidden layer (256-dimensional) with sigmoid activation on MNIST data.