Devanshusisodiya/scratch-nn
Analysis of MLP and Regressor Performance Over Multiple Optimizers

This is a neural network I created from scratch to get a better understanding of the concepts. Everything is documented in an easy-to-follow document, with bite-sized chunks covering the concepts required to build this network.

The dataset used was the MNIST handwritten digits dataset, and the network has the architecture illustrated below.

[network architecture diagram]
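The diagram itself isn't reproduced here, but as a rough sketch of what such a network computes, here is a minimal NumPy forward pass for MNIST (784 inputs, 10 output classes; the 128-unit hidden layer is a hypothetical placeholder, since the actual layer sizes live in the diagram):

```python
import numpy as np

def relu(x):
    # elementwise max(0, x) hidden-layer activation
    return np.maximum(0.0, x)

def softmax(z):
    # subtract the row max for numerical stability before exponentiating
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def forward(X, W1, b1, W2, b2):
    """One-hidden-layer MLP: 784 -> hidden -> 10 class probabilities."""
    h = relu(X @ W1 + b1)
    return softmax(h @ W2 + b2)

rng = np.random.default_rng(0)
W1 = rng.normal(0.0, 0.01, (784, 128)); b1 = np.zeros(128)
W2 = rng.normal(0.0, 0.01, (128, 10));  b2 = np.zeros(10)

# a batch of 4 fake "images": each row of the output sums to 1
probs = forward(rng.normal(size=(4, 784)), W1, b1, W2, b2)
```

Each output row is a probability distribution over the 10 digit classes, which is what a cross-entropy loss would be computed against during training.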

I trained the network using two different optimizers, Gradient Descent (defined in the gradient_descent function) and Gradient Descent with Momentum (defined in the gradient_descent_wm function), to compare how the network converges under each.
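As a sketch of the two update rules (the README only names the functions; the signatures and in-place parameter updates here are my assumption, not the repo's actual code):

```python
import numpy as np

def gradient_descent(params, grads, lr=0.01):
    """Vanilla gradient descent: theta <- theta - lr * grad."""
    for p, g in zip(params, grads):
        p -= lr * g  # in-place update of each parameter array

def gradient_descent_wm(params, grads, velocities, lr=0.01, beta=0.9):
    """Gradient descent with momentum:
    v <- beta * v + grad;  theta <- theta - lr * v."""
    for p, g, v in zip(params, grads, velocities):
        v *= beta   # decay the accumulated velocity
        v += g      # add the current gradient
        p -= lr * v # step along the velocity, not the raw gradient
```

Momentum keeps an exponentially decaying average of past gradients, which damps oscillations across steep directions and speeds progress along directions where the gradient stays consistent, so it often converges in fewer iterations than plain gradient descent.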
