amaynez/GenericNeuralNetwork
Generic Neural Network Python Library

  • only supports sigmoid activation
  • n fully connected sequential layers (dense)
  • MSE loss
  • stochastic gradient descent

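The building blocks listed above (sigmoid activation, MSE loss) can be sketched in NumPy; this is illustrative, not the library's actual code:

```python
import numpy as np

def sigmoid(z):
    # logistic activation: squashes any real input into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    # derivative of the sigmoid, used during backpropagation
    s = sigmoid(z)
    return s * (1.0 - s)

def mse(predicted, target):
    # mean squared error between prediction and target vectors
    return np.mean((predicted - target) ** 2)
```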
NOTE: the code for this library has been greatly enhanced for a subsequent project (link here)

This program creates a neural network programmatically with the following parameters:

  • number of inputs
  • number of neurons in hidden layer 1, ..., number of neurons in hidden layer n
  • number of outputs
  • learning rate

Once created, the neural network exposes two functions:

  • Forward Propagation: to generate a prediction or guess based on the inputs
  • Train: to modify the inner weights and biases based on given inputs and target outputs

For testing purposes, the XOR problem is solved in the main.py script.
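A self-contained XOR training loop in the same spirit can be sketched as follows (full-batch gradient descent here for brevity, where the library itself uses stochastic updates; sizes and hyperparameters are illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR truth table: 4 input rows, 1 target column each
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(42)
W1 = rng.standard_normal((2, 4))  # 2 inputs -> 4 hidden neurons
b1 = np.zeros((1, 4))
W2 = rng.standard_normal((4, 1))  # 4 hidden -> 1 output
b2 = np.zeros((1, 1))
lr = 0.5

for _ in range(20000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    y = sigmoid(h @ W2 + b2)
    # backward pass: MSE loss, sigmoid derivative a * (1 - a)
    dy = (y - T) * y * (1 - y)
    dh = (dy @ W2.T) * h * (1 - h)  # uses pre-update W2
    # gradient descent updates
    W2 -= lr * (h.T @ dy)
    b2 -= lr * dy.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ dh)
    b1 -= lr * dh.sum(axis=0, keepdims=True)

# predictions should typically approach [0, 1, 1, 0]
print(np.round(y.ravel(), 2))
```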

TO DO:

  • multiple activation functions (ReLU, linear, tanh, etc.)
  • multiple optimizers (Adam, RMSProp, SGD Momentum, etc.)
  • batch and epoch training schedules
  • save and load trained models to/from file