These are the notebooks from my Introduction to Machine Learning class. I learned a variety of ML methods, including regression, classification, decision trees, non-parametric methods, clustering, and recommender systems.
I got familiar with scikit-learn, NumPy/pandas, and PyTorch.
The most interesting notebook is Kaggle_demo.ipynb, where I participated in a Kaggle competition and built a classification model from start to finish by myself. I chose a Random Forest model and tuned the maximum tree depth based on accuracy on the validation data. The model predicted whether or not a student would successfully finish an online course with an accuracy of 0.975.
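The sketch below outlines the general approach described above (a Random Forest with max_depth selected by validation accuracy), not the exact code in Kaggle_demo.ipynb; the data, depth grid, and other hyperparameters here are placeholders.

```python
# Minimal sketch of tuning a Random Forest's max_depth on a validation set.
# The feature matrix X and binary labels y below are synthetic placeholders
# standing in for the Kaggle competition data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X = np.random.rand(500, 10)              # placeholder features
y = np.random.randint(0, 2, size=500)    # placeholder labels (finished course or not)

# Hold out a validation set for selecting the maximum tree depth.
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.2, random_state=0
)

best_depth, best_acc = None, 0.0
for depth in [2, 4, 6, 8, 10, None]:     # candidate depths (example grid)
    model = RandomForestClassifier(
        n_estimators=100, max_depth=depth, random_state=0
    )
    model.fit(X_train, y_train)
    acc = accuracy_score(y_val, model.predict(X_val))
    if acc > best_acc:
        best_depth, best_acc = depth, acc

print(f"Best max_depth: {best_depth}, validation accuracy: {best_acc:.3f}")
```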