Activation Functions and Weight Initialisation Methods

This repository contains the code for analyzing the effect of activation functions and weight initialisation methods on deep neural networks.

Outline of the Notebook

  • Setup Packages
  • Generate data
  • Write a feedforward class
  • Analyze the activation functions and weight initialization methods

Methodology

To analyze the effect of activation functions and weight initialisation methods on deep neural networks, we first generate non-linearly separable data with two classes and write a simple feedforward neural network that supports all of the activation functions and weight initialisation methods. We then compare the different scenarios using loss plots.
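The notebook in this repository contains the actual implementation; the snippet below is only a minimal NumPy sketch of the same pipeline. The layer sizes, hyperparameters, and supported options (`sigmoid`/`tanh`/`relu` activations, `random`/`xavier`/`he` initialisation) are illustrative assumptions, and `make_moons` stands in for the notebook's generated data.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_moons

# Two-class, non-linearly separable data (illustrative stand-in for the notebook's data).
X, y = make_moons(n_samples=1000, noise=0.2, random_state=0)
y = y.reshape(-1, 1)

# Each activation is paired with its derivative expressed in terms of the activation output.
ACTIVATIONS = {
    "sigmoid": (lambda z: 1 / (1 + np.exp(-z)), lambda a: a * (1 - a)),
    "tanh":    (np.tanh,                        lambda a: 1 - a ** 2),
    "relu":    (lambda z: np.maximum(0, z),     lambda a: (a > 0).astype(float)),
}

def init_weights(sizes, method, rng):
    """Build one [W, b] pair per layer using the chosen initialisation scheme."""
    params = []
    for fan_in, fan_out in zip(sizes[:-1], sizes[1:]):
        if method == "xavier":
            scale = np.sqrt(1.0 / fan_in)
        elif method == "he":
            scale = np.sqrt(2.0 / fan_in)
        else:  # plain small random values
            scale = 0.1
        params.append([rng.normal(0.0, scale, (fan_in, fan_out)), np.zeros((1, fan_out))])
    return params

def train(activation, init_method, hidden=(8, 8), epochs=500, lr=0.5):
    """Train a small feedforward net and return the per-epoch cross-entropy loss."""
    act, act_grad = ACTIVATIONS[activation]
    rng = np.random.default_rng(0)
    params = init_weights([X.shape[1], *hidden, 1], init_method, rng)
    losses = []
    for _ in range(epochs):
        # Forward pass: hidden layers use the chosen activation, output is sigmoid.
        activations = [X]
        for W, b in params[:-1]:
            activations.append(act(activations[-1] @ W + b))
        W_out, b_out = params[-1]
        p = 1 / (1 + np.exp(-(activations[-1] @ W_out + b_out)))
        losses.append(float(-np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))))
        # Backward pass: gradient of cross-entropy w.r.t. the pre-sigmoid output is p - y.
        delta = (p - y) / len(X)
        grads = []
        for i in range(len(params) - 1, -1, -1):
            grads.append((activations[i].T @ delta, delta.sum(axis=0, keepdims=True)))
            if i > 0:
                delta = (delta @ params[i][0].T) * act_grad(activations[i])
        # Gradient-descent update (in place on the parameter arrays).
        for (W, b), (dW, db) in zip(params, reversed(grads)):
            W -= lr * dW
            b -= lr * db
    return losses

# Compare activation / initialisation combinations via their loss curves.
for act_name in ("sigmoid", "tanh", "relu"):
    for init_name in ("random", "xavier", "he"):
        plt.plot(train(act_name, init_name), label=f"{act_name} + {init_name}")
plt.xlabel("epoch"); plt.ylabel("binary cross-entropy loss"); plt.legend(); plt.show()
```

Each call to `train` repeats the experiment with a different activation/initialisation pair while keeping the data and architecture fixed, so the resulting loss curves can be compared directly, which mirrors the comparison the notebook makes.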

Jump into code

  • Click here to open and execute the code directly in Colab

Blog posts

Related blog posts for a better understanding of the code in this repository:

  • Theory
  • Code Implementation Walkthrough