
Machine-Learning-with-Tree-Based-Models-in-Python

01 Decision Tree Regression (Theory)

  • Non-parametric algorithm
  • Finds the descriptive features that carry the most information about the target
  • Splits on those features to obtain pure subsets
  • Learns simple decision rules to predict the target variable
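The bullets above can be sketched with scikit-learn's `DecisionTreeRegressor` (a minimal sketch; the toy sine data and `max_depth=3` are illustrative assumptions, not from the course):

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Toy 1-D data: the tree must discover which x-regions best predict y
rng = np.random.RandomState(0)
X = np.sort(rng.uniform(0, 5, 80)).reshape(-1, 1)
y = np.sin(X).ravel() + rng.normal(0, 0.1, 80)

# Each internal node is a simple decision rule (x <= threshold);
# max_depth caps how many rules can stack on top of each other
tree = DecisionTreeRegressor(max_depth=3, random_state=0)
tree.fit(X, y)

# A leaf predicts the mean target of the training samples that reach it
print(tree.predict([[2.5]]))
```

Being non-parametric, the tree's shape is driven entirely by the data; `max_depth` only bounds how far the recursive splitting may go.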

02 Classification Tree (Theory)

  • A decision tree applied to classification problems

03 Entropy, Information Gain & Gini Index (Theory)

  • Which feature to split on first
  • Entropy measures the purity of a split:
    • -[p(class 1) · log₂ p(class 1)] - [p(class 2) · log₂ p(class 2)]
  • Information Gain is the entropy of the parent node minus the weighted average entropy of its child nodes
  • Gini Index also measures the purity of a split:
    • 1 - Σ p(class i)²
  • For binary classification, Gini impurity ranges from 0 to 0.5 and entropy from 0 to 1
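The two measures can be written as small helper functions for the binary case (a minimal sketch; the function names are ours):

```python
from math import log2

def entropy(p):
    """Binary entropy: -p*log2(p) - (1-p)*log2(1-p); 0 at a pure node."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def gini(p):
    """Binary Gini impurity: 1 - p**2 - (1-p)**2; also 0 at a pure node."""
    return 1 - p**2 - (1 - p)**2

# A 50/50 split is maximally impure: entropy hits 1.0, Gini hits 0.5,
# matching the ranges quoted above
print(entropy(0.5), gini(0.5))  # 1.0 0.5
```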

04 Decision Tree Classification (Theory)

  • A decision tree applied to classification problems

05 Decision Tree Regression (Python Code)

  • Step-by-step Python code to visualize a regression tree
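A minimal sketch of what such a visualization might look like with scikit-learn's `plot_tree` (the toy data, depth, and output filename are assumptions):

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend; we save to a file instead of showing
import matplotlib.pyplot as plt
import numpy as np
from sklearn.tree import DecisionTreeRegressor, plot_tree

# Toy 1-D regression data
rng = np.random.RandomState(0)
X = rng.uniform(0, 5, (80, 1))
y = np.sin(X).ravel()

reg = DecisionTreeRegressor(max_depth=2, random_state=0).fit(X, y)

# Draw the tree: each box shows the split rule, samples, and leaf value
plot_tree(reg, feature_names=["x"], filled=True)
plt.savefig("regression_tree.png")
```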

06 Decision Tree Classification (Python Code)

  • Step-by-step Python code to visualize a classification tree
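In the same spirit, a classification tree's learned rules can be rendered as text with `export_text` (a minimal sketch on the Iris dataset; `plot_tree` gives the graphical equivalent):

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

iris = load_iris()
clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(iris.data, iris.target)

# One line per decision rule, indented by depth, with the class at each leaf
print(export_text(clf, feature_names=list(iris.feature_names)))
```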

07 Random Forest and Ensemble Technique (Theory)

  • Bagging (bootstrap aggregation) technique
  • A collection of decision trees
  • Variable-importance measure
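The three bullets above come together in scikit-learn's `RandomForestClassifier`, which bags many trees and exposes a variable-importance measure (a minimal sketch on the Iris dataset; `n_estimators=100` is an assumption):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

iris = load_iris()

# Each of the 100 trees is trained on a bootstrap sample of the data (bagging)
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(iris.data, iris.target)

# Impurity-based variable importances: one value per feature, summing to 1
for name, imp in zip(iris.feature_names, rf.feature_importances_):
    print(f"{name}: {imp:.3f}")
```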

08 Voting Classifier (Theory)

  • Hard Voting: predicts the class that receives the majority of the votes
  • Soft Voting: predicts the class with the highest average predicted probability
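Both schemes are available through scikit-learn's `VotingClassifier` (a minimal sketch; the three base estimators are our choice, not from the course):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
estimators = [("lr", LogisticRegression(max_iter=1000)),
              ("dt", DecisionTreeClassifier(random_state=0)),
              ("nb", GaussianNB())]

# Hard voting: each estimator casts one vote; the majority class wins
hard = VotingClassifier(estimators, voting="hard").fit(X, y)

# Soft voting: average the class probabilities, then take the argmax
soft = VotingClassifier(estimators, voting="soft").fit(X, y)

print(hard.score(X, y), soft.score(X, y))
```

Soft voting usually helps when the base estimators produce well-calibrated probabilities; otherwise hard voting is the safer default.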

09 Random Forest (Python Code)

  • Step-by-step Python code for a random forest

10 Grid Search (Python Code)

  • Hyperparameter Tuning
  • Cross Validation
  • Grid Search with Cross Validation
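The three bullets combine in scikit-learn's `GridSearchCV`, which cross-validates every hyperparameter combination in a grid (a minimal sketch; the grid values are assumptions):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = load_iris(return_X_y=True)

# Hypothetical small grid: 2 x 2 = 4 combinations, each scored with 5-fold CV
param_grid = {"n_estimators": [50, 100], "max_depth": [2, 4]}
grid = GridSearchCV(RandomForestClassifier(random_state=0), param_grid, cv=5)
grid.fit(X, y)

# Best combination and its mean cross-validated accuracy
print(grid.best_params_, grid.best_score_)
```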

11 Interview Questions: Decision Tree & Random Forest