FIRST WEEK:

  1. Basic building blocks of a Neural Network
  2. Perceptron
  3. Neurons
  4. Hidden Layers
  5. Linear regression with Neural Networks
  6. Logistic regression with Neural Networks
  7. Non-linear Activation Functions
  8. tanh, step, sigmoid, ReLU, ELU
  9. Backpropagation
  10. Vanishing and Exploding Gradients
  11. Ways to avoid Vanishing and Exploding Gradients
  12. How to mitigate overfitting?
  13. TensorFlow / Keras practical
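
The basic building blocks above (weighted sum, activation, weight updates) can be sketched without any framework. A minimal NumPy perceptron learning the AND function; the dataset, learning rate, and epoch count here are illustrative choices, not course material:

```python
import numpy as np

# Perceptron: weighted sum + step activation, with the classic
# error-driven update rule. AND is linearly separable, so it converges.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

w = np.zeros(2)   # weights
b = 0.0           # bias
lr = 0.1          # learning rate

def predict(x):
    return 1 if x @ w + b > 0 else 0   # step activation

for _ in range(20):                    # a few epochs suffice for AND
    for xi, yi in zip(X, y):
        err = yi - predict(xi)
        w += lr * err * xi             # nudge weights toward the example
        b += lr * err

print([predict(xi) for xi in X])       # [0, 0, 0, 1]
```

Swapping the step function for a sigmoid and the update rule for gradient descent on cross-entropy turns this same structure into the logistic-regression neuron covered later in the week.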

SECOND WEEK:

  1. Parameter explosion in image recognition
  2. Convolution Layer - kernel, filter, stride, padding, feature map
  3. Pooling Layer - max, min, average
  4. CNN architecture
  5. Keras implementation
  6. Image recognition: Basic NN vs. CNN
  7. Advanced Deep CNNs
  8. Pre-trained Models
  9. Transfer Learning - ResNet50
  10. Image Augmentation
  11. TensorBoard
  12. OpenCV, YOLOv3
  13. Sample Hackathon
  13. Sample Hackathon
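
The convolution and pooling mechanics listed above (kernel, stride, padding, feature map, max pooling) can be sketched in plain NumPy. The function names `conv2d` and `max_pool` and the toy 4x4 image are illustrative, not course code:

```python
import numpy as np

def conv2d(image, kernel, stride=1, padding=0):
    """Single-channel cross-correlation with optional zero padding."""
    if padding:
        image = np.pad(image, padding)          # zero-pad the borders
    kh, kw = kernel.shape
    oh = (image.shape[0] - kh) // stride + 1    # feature-map height
    ow = (image.shape[1] - kw) // stride + 1    # feature-map width
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            patch = image[i*stride:i*stride+kh, j*stride:j*stride+kw]
            out[i, j] = np.sum(patch * kernel)  # slide kernel over image
    return out

def max_pool(fmap, size=2):
    """Non-overlapping max pooling over size x size windows."""
    h, w = fmap.shape[0] // size, fmap.shape[1] // size
    return fmap[:h*size, :w*size].reshape(h, size, w, size).max(axis=(1, 3))

img = np.arange(16, dtype=float).reshape(4, 4)
k = np.array([[1., 0.], [0., -1.]])  # diagonal-difference kernel
fmap = conv2d(img, k)                # 3x3 feature map (no padding, stride 1)
pooled = max_pool(fmap)              # 1x1 after 2x2 max pooling
```

The kernel reuses the same few weights at every position, which is exactly why a CNN avoids the parameter explosion of a fully connected layer on raw pixels.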

THIRD WEEK:

  1. The networks so far only see the input at the current time step
  2. For sequence data, we want to remember past outputs to predict the future
  3. Neurons with memory
  4. RNN architecture
  5. Backpropagation Through Time (BPTT)
  6. Problems with BPTT
  7. Vanishing and Exploding Gradients
  8. Truncated BPTT
  9. LSTM
  10. LSTM Architecture
  11. Keras LSTM implementation
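
The "neuron with memory" idea above boils down to feeding the previous hidden state back in alongside the current input. A vanilla-RNN unrolling sketch in NumPy (the weight shapes and random inputs are illustrative assumptions, not course code):

```python
import numpy as np

def rnn_forward(xs, Wx, Wh, b, h0):
    """Unroll a vanilla RNN: h_t = tanh(Wx @ x_t + Wh @ h_{t-1} + b)."""
    h = h0
    states = []
    for x in xs:                        # one step per element of the sequence
        h = np.tanh(Wx @ x + Wh @ h + b)
        states.append(h)                # BPTT would replay these in reverse
    return states

rng = np.random.default_rng(0)
hidden, inp = 3, 2
Wx = 0.1 * rng.normal(size=(hidden, inp))     # input-to-hidden weights
Wh = 0.1 * rng.normal(size=(hidden, hidden))  # hidden-to-hidden (the memory)
b = np.zeros(hidden)
xs = [rng.normal(size=inp) for _ in range(5)] # a length-5 toy sequence

states = rnn_forward(xs, Wx, Wh, b, np.zeros(hidden))
```

Because backpropagation through this loop multiplies by `Wh` (and the tanh derivative) once per time step, gradients shrink or blow up over long sequences, which is what motivates truncated BPTT and the LSTM's gated cell.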

References:

  - https://github.com/omerbsezer/LSTM_RNN_Tutorials_with_Demo#SampleStock
  - https://github.com/fchollet/deep-learning-with-python-notebooks/blob/master/8.1-text-generation-with-lstm.ipynb
  - https://github.com/dipanjanS/nlp_workshop_odsc19
  - https://github.com/buomsoo-kim/Easy-deep-learning-with-Keras