This project demonstrates several ways to perform feature selection on a dataset. The topics covered are listed below, followed by minimal illustrative code sketches for each.
- Feature Selection with Filter Methods
- Constant, Quasi-Constant and Duplicate Feature Removal
- Correlated Feature Removal
- Feature Selection Based on Univariate ROC_AUC for Classification and MSE for Regression
- Feature Selection Based on Mutual Information (Entropy) Gain for Classification and Regression
- Feature Selection Based on Univariate (ANOVA) Test for Classification
- Feature Selection using Fisher Score and Chi-Squared (χ²) Test
- Feature Dimension Reduction Using LDA and PCA with Python | Principal Component Analysis in Feature Selection
- Step Forward, Step Backward and Exhaustive Feature Selection | Wrapper Method
- Use of Linear and Logistic Regression Coefficients with Lasso (L1) and Ridge (L2) Regularization for Feature Selection
- Recursive Feature Elimination (RFE) by Using Tree Based and Gradient Based Estimators
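The sketches below are minimal, illustrative examples of each topic rather than the project's exact notebooks: they assume scikit-learn, pandas and NumPy are available, and every dataset choice, threshold and parameter value is an arbitrary assumption made for demonstration.

Constant, quasi-constant and duplicate feature removal can be sketched with `VarianceThreshold` plus a transposed `duplicated()` check; the synthetic column names and the 0.01 variance cutoff are assumptions.

```python
import numpy as np
import pandas as pd
from sklearn.feature_selection import VarianceThreshold

rng = np.random.default_rng(0)
n = 100
X_train = pd.DataFrame({
    "constant": np.zeros(n),                        # zero variance
    "quasi_constant": np.r_[np.zeros(n - 1), 1.0],  # 99% a single value
    "dup_a": rng.normal(size=n),
    "useful": rng.normal(size=n),
})
X_train["dup_b"] = X_train["dup_a"]                 # exact copy of dup_a

# Constant and quasi-constant features: variance at or below a small threshold.
vt = VarianceThreshold(threshold=0.01).fit(X_train)
kept = X_train.columns[vt.get_support()]

# Duplicate features: transpose so identical columns become duplicate rows.
dup_cols = X_train[kept].T[X_train[kept].T.duplicated()].index.tolist()

X_reduced = X_train[kept].drop(columns=dup_cols)
print(X_reduced.columns.tolist())  # expected: ['dup_a', 'useful']
```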
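Correlated feature removal: drop one member of every feature pair whose absolute Pearson correlation exceeds a cutoff. The breast-cancer dataset and the 0.9 cutoff are placeholders.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer

X, y = load_breast_cancer(return_X_y=True, as_frame=True)

# Absolute pairwise correlations; keep only the upper triangle so each pair is seen once.
corr = X.corr().abs()
upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))

# Drop one feature from every pair correlated above the cutoff.
to_drop = [col for col in upper.columns if (upper[col] > 0.9).any()]
X_uncorrelated = X.drop(columns=to_drop)
print(f"dropped {len(to_drop)} of {X.shape[1]} features")
```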
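Univariate ROC-AUC selection: fit a tiny model on one feature at a time and rank features by hold-out ROC-AUC; for regression the same loop would score with mean squared error instead. The decision-tree depth, split sizes and the 0.6 cutoff are illustrative.

```python
import pandas as pd
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# One shallow tree per feature, scored by ROC-AUC on the hold-out split.
scores = {}
for col in X_tr.columns:
    tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X_tr[[col]], y_tr)
    scores[col] = roc_auc_score(y_te, tree.predict_proba(X_te[[col]])[:, 1])

ranking = pd.Series(scores).sort_values(ascending=False)
selected = ranking[ranking > 0.6].index.tolist()  # arbitrary cutoff
print(ranking.head())
```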
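Mutual information (entropy) gain: `SelectKBest` with `mutual_info_classif` keeps the k features sharing the most information with the target; `mutual_info_regression` plays the same role for regression targets. k=10 is arbitrary.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, mutual_info_classif

X, y = load_breast_cancer(return_X_y=True, as_frame=True)

# Keep the 10 features with the highest estimated mutual information with the target.
mi_selector = SelectKBest(score_func=mutual_info_classif, k=10).fit(X, y)
print(X.columns[mi_selector.get_support()].tolist())
```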
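Univariate ANOVA: `f_classif` computes an F-test between each feature and the class label, and `SelectKBest` keeps the highest-scoring features (k=10 is again arbitrary).

```python
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, f_classif

X, y = load_breast_cancer(return_X_y=True, as_frame=True)

# ANOVA F-test between each feature and the class label; keep the top 10 scores.
anova_selector = SelectKBest(score_func=f_classif, k=10).fit(X, y)
print(X.columns[anova_selector.get_support()].tolist())
```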
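Chi-squared selection: scikit-learn's `chi2` scorer requires non-negative features (counts, frequencies or min-max scaled values), which the breast-cancer measurements satisfy. The Fisher score is usually computed with a separate package such as `skfeature`, so only the chi-squared half is sketched here.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, chi2

X, y = load_breast_cancer(return_X_y=True, as_frame=True)

# chi2 needs non-negative inputs; rank features by their chi-squared statistic.
chi2_selector = SelectKBest(score_func=chi2, k=10).fit(X, y)
print(X.columns[chi2_selector.get_support()].tolist())
```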
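Dimensionality reduction with PCA and LDA: PCA is unsupervised and projects onto directions of maximum variance, while LDA is supervised and limited to at most (n_classes - 1) components. The component counts below are placeholders.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_scaled = StandardScaler().fit_transform(X)

# PCA: unsupervised projection onto the directions of maximum variance.
X_pca = PCA(n_components=2).fit_transform(X_scaled)

# LDA: supervised projection; with two classes at most one component is possible.
X_lda = LinearDiscriminantAnalysis(n_components=1).fit_transform(X_scaled, y)

print(X_pca.shape, X_lda.shape)
```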
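Step forward and step backward selection (wrapper method): scikit-learn's `SequentialFeatureSelector` greedily adds or removes one feature at a time based on cross-validated score; exhaustive search over all subsets is typically done with a separate package such as mlxtend and is omitted here. The estimator, the target of 5 features and cv=3 are assumptions.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
estimator = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

# Greedy wrapper search: add (forward) or remove (backward) one feature at a time,
# keeping whichever set gives the best cross-validated score.
sfs_forward = SequentialFeatureSelector(
    estimator, n_features_to_select=5, direction="forward", cv=3).fit(X, y)
sfs_backward = SequentialFeatureSelector(
    estimator, n_features_to_select=5, direction="backward", cv=3).fit(X, y)

print(X.columns[sfs_forward.get_support()].tolist())
print(X.columns[sfs_backward.get_support()].tolist())
```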
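Regularized regression coefficients: an L1 (Lasso-style) penalty drives some coefficients to exactly zero, so `SelectFromModel` can keep the non-zero ones, while an L2 (Ridge-style) penalty only shrinks coefficients, so a magnitude threshold is used instead. The C values and the median threshold are arbitrary.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_scaled = StandardScaler().fit_transform(X)

# L1 penalty zeroes out weak coefficients; keep features with non-zero weights.
l1_selector = SelectFromModel(
    LogisticRegression(penalty="l1", C=0.1, solver="liblinear")).fit(X_scaled, y)
print(X.columns[l1_selector.get_support()].tolist())

# L2 penalty only shrinks coefficients, so keep those above the median magnitude.
l2_selector = SelectFromModel(
    LogisticRegression(penalty="l2", C=0.1, solver="liblinear"),
    threshold="median").fit(X_scaled, y)
print(X.columns[l2_selector.get_support()].tolist())
```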
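Recursive feature elimination: `RFE` fits the estimator, drops the least important features, and repeats until the requested number remains; a random forest and gradient boosting stand in as the tree-based and gradient-based estimators, with arbitrary settings.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.feature_selection import RFE

X, y = load_breast_cancer(return_X_y=True, as_frame=True)

# Repeatedly fit the estimator and drop the weakest features until 10 remain.
rfe_tree = RFE(RandomForestClassifier(n_estimators=100, random_state=0),
               n_features_to_select=10).fit(X, y)
rfe_gbm = RFE(GradientBoostingClassifier(random_state=0),
              n_features_to_select=10).fit(X, y)

print(X.columns[rfe_tree.get_support()].tolist())
print(X.columns[rfe_gbm.get_support()].tolist())
```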