Table of Contents
- Doc
- Random Forest
- Gradient boosting
- Dimensionality reduction
- Clustering
- Benchmark
- Neural Network
- Deep Learning
- Optimization
- Very Brief Introduction to Machine Learning for AI
- Markov Chains
- What are the advantages of different classification algorithms?
- VisuAlgo - visualising data structures and algorithms through animation
- https://oj.leetcode.com/problems/
- My recommendations – SlideShare Presentations on Data Science
- A Summary and Classification of Common Machine Learning Algorithms
- A Tour of Machine Learning Algorithms
- **dive-into-machine-learning**
- Machine Learning Roadmap: Your Self-Study Guide to Machine Learning
- How to Choose a Machine Learning Algorithm
- Machine Learning Series (4): An Overview of Machine Learning Algorithms, with Application Advice and Solution Approaches
- Detecting diabetic retinopathy in eye images
- Cheatsheet – Python & R codes for common Machine Learning Algorithms
- Essentials of Machine Learning Algorithms (with Python and R Codes)
- useR-machine-learning-tutorial
- What is feature engineering?
Learning
- What to do when you can't follow the math in machine learning papers?
- 5 Techniques To Understand Machine Learning Algorithms Without the Background in Mathematics
- machine-learning-for-software-engineers
- What If I Am Not Good At Mathematics
- Powerful Guide to learn Random Forest (with codes in R & Python), Tuning the parameters of your Random Forest model
- awesome-random-forest
- A Complete Tutorial on Tree Based Modeling from Scratch (in R & Python)
- Boruta all-relevant feature selection method
- Parallelized Mutual Information based Feature Selection module
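The feature-selection module above ranks features by mutual information. As a refresher on what that score measures, here is a minimal, dependency-free sketch of I(X;Y) for two discrete variables, computed from joint counts (not the module's own implementation):

```python
# Mutual information I(X;Y) = sum_{x,y} p(x,y) * log( p(x,y) / (p(x) p(y)) )
# Toy sketch for discrete variables; real feature-selection libraries
# also handle continuous features via binning or k-NN estimators.
import math
from collections import Counter

def mutual_information(xs, ys):
    n = len(xs)
    pxy = Counter(zip(xs, ys))   # joint counts
    px = Counter(xs)             # marginal counts of X
    py = Counter(ys)             # marginal counts of Y
    mi = 0.0
    for (x, y), c in pxy.items():
        mi += (c / n) * math.log((c / n) / ((px[x] / n) * (py[y] / n)))
    return mi

# a feature identical to the label carries maximal information (ln 2 here)
print(round(mutual_information([0, 0, 1, 1], [0, 0, 1, 1]), 3))  # → 0.693
# an independent feature carries none
print(round(mutual_information([0, 1, 0, 1], [0, 0, 1, 1]), 3))  # → 0.0
```

A feature-selection pass then simply keeps the features with the highest score against the target.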
- CatBoost - An open-source gradient boosting on decision trees library with categorical feature support out of the box, for Python and R https://catboost.yandex
- LightGBM - A fast, distributed, high-performance gradient boosting (GBDT, GBRT, GBM, or MART) framework based on decision tree algorithms, used for ranking, classification, and many other machine learning tasks. It is under the umbrella of Microsoft's DMTK (http://github.com/microsoft/dmtk) project.
- XGBoost - An optimized general-purpose gradient boosting library. The library is parallelized, and also provides an optimized distributed version.
- A Guide to Gradient Boosted Trees with XGBoost in Python
- CatBoost vs. LightGBM vs. XGBoost | An overview of the similarities and differences of XGBoost, LightGBM, and CatBoost, from structure to performance
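All three libraries above implement the same core idea: fit weak trees to the residuals of the current ensemble and add them with a shrunken step. A dependency-free sketch with one-split regression stumps and squared error (a toy illustration, not any of the libraries' actual algorithms):

```python
# Minimal gradient boosting for regression with decision stumps.
# For squared error the negative gradient is just the residual y - pred.

def fit_stump(x, residuals):
    """Find the single threshold split on x that best fits the residuals."""
    best = None
    for t in sorted(set(x)):
        left = [r for xi, r in zip(x, residuals) if xi <= t]
        right = [r for xi, r in zip(x, residuals) if xi > t]
        if not left or not right:
            continue
        lmean, rmean = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, t, lmean, rmean)
    _, t, lmean, rmean = best
    return lambda xi: lmean if xi <= t else rmean

def gradient_boost(x, y, n_rounds=50, lr=0.1):
    base = sum(y) / len(y)               # start from the mean prediction
    pred = [base] * len(y)
    stumps = []
    for _ in range(n_rounds):
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        stump = fit_stump(x, residuals)  # fit a weak learner to the residuals
        stumps.append(stump)
        pred = [pi + lr * stump(xi) for pi, xi in zip(pred, x)]
    return lambda xi: base + lr * sum(s(xi) for s in stumps)

x = [1, 2, 3, 4, 5, 6]
y = [1.0, 1.1, 0.9, 3.0, 3.1, 2.9]
model = gradient_boost(x, y)
```

`model(2)` lands near 1.0 and `model(5)` near 3.0, the two group means. XGBoost, LightGBM, and CatBoost differ mainly in how they grow the trees, regularize, and handle categorical features, not in this outer loop.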
- annoy - Approximate Nearest Neighbors in C++/Python optimized for memory usage and loading/saving to disk
- pysparnn - Approximate Nearest Neighbor Search for Sparse Data in Python!
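Libraries like annoy and pysparnn approximate nearest-neighbour search with index structures (random-projection trees, cluster pruning). The exact baseline they trade accuracy against is a brute-force scan; a tiny sketch by cosine distance (toy vectors, not either library's API):

```python
# Exact nearest-neighbour search by brute-force cosine distance.
# O(n) per query; ANN libraries accept approximate answers to do
# much better on large collections.
import math

def cosine_dist(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / (na * nb)

def nearest(query, vectors):
    """Index of the vector closest to query."""
    return min(range(len(vectors)), key=lambda i: cosine_dist(query, vectors[i]))

vecs = [(1.0, 0.0), (0.0, 1.0), (0.7, 0.7)]
print(nearest((0.9, 0.1), vecs))  # → 0
```

Benchmarking an ANN index usually means measuring recall against exactly this kind of exhaustive scan.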
- benchm-ml - A minimal benchmark for scalability, speed and accuracy of commonly used open source implementations (R packages, Python scikit-learn, H2O, xgboost, Spark MLlib etc.) of the top machine learning algorithms for binary classification (random forests, gradient boosted trees, deep neural networks etc.).
- Learning How To Code Neural Networks
- A Neural Network in 11 lines of Python (Part 1)
- Anyone can write LSTM-RNN code in Python! (The best starting point for learning neural networks). English
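The tutorials above build small networks in numpy; the smallest possible companion piece is a single perceptron trained with the classic error-driven update rule, in plain Python (a toy sketch, not from any of the linked posts):

```python
# A perceptron learning the AND function.
# Update rule: w += lr * (target - output) * input

def train_perceptron(samples, epochs=20, lr=0.1):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(AND)
predict = lambda x1, x2: 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
print([predict(x1, x2) for (x1, x2), _ in AND])  # → [0, 0, 0, 1]
```

Since AND is linearly separable, the perceptron converges; the linked posts show what changes once you stack layers and need backpropagation.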
- DeepLearning.TV
- A Concise Introduction to Deep Learning
- A Deep Learning Tutorial: From Perceptrons to Deep Networks
- UFLDL Tutorial
- Deep Learning Tutorials
- Hot Topics in Machine Learning Research: Deep Learning
- 7 Steps for Becoming a Deep Learning Expert (Chinese translation)
- Must Know Tips/Tricks in Deep Neural Networks (by Xiu-Shen Wei)
Packages
- Evaluation of Deep Learning Frameworks
- Deep Machine Learning libraries and frameworks
- Deep Learning Libraries by Language
- mxnet - Lightweight, Portable, Flexible Distributed Deep Learning with Dynamic, Mutation-aware Dataflow Dep Scheduler; for Python, R, Julia, Go, and more
- Understanding Deep Learning with Theano: Logistic Regression, Multilayer Perceptron, and Convolutional Neural Networks
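The first of those Theano tutorials trains logistic regression; the same model fits in a few lines of dependency-free Python, which makes the gradient update the tutorials automate visible (a toy sketch with made-up data, not the tutorial's code):

```python
# Logistic regression by stochastic gradient descent.
# Cross-entropy loss gives the gradient (p - y) * x per sample.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, epochs=500, lr=0.5):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), y in samples:
            p = sigmoid(w[0] * x1 + w[1] * x2 + b)
            w[0] -= lr * (p - y) * x1
            w[1] -= lr * (p - y) * x2
            b -= lr * (p - y)
    return w, b

# linearly separable toy data: label is 1 when x1 + x2 > 1
data = [((0, 0), 0), ((1, 0), 0), ((0, 1), 0),
        ((1, 1), 1), ((2, 1), 1), ((1, 2), 1)]
w, b = train(data)
preds = [1 if sigmoid(w[0] * x1 + w[1] * x2 + b) > 0.5 else 0
         for (x1, x2), _ in data]
print(preds)  # → [0, 0, 0, 1, 1, 1]
```

Theano's contribution is computing these gradients symbolically and running them on the GPU; the multilayer perceptron tutorial then stacks a hidden layer on top of this same loss.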
- hyperas - Keras + Hyperopt: A very simple wrapper for convenient hyperparameter optimization http://maxpumperla.github.io/hyperas/
TensorFlow
- Spearmint - Spearmint Bayesian optimization codebase
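Bayesian optimizers like Spearmint and hyperas are usually measured against the simplest tuner of all, random search over the hyperparameter space. A dependency-free sketch with a made-up objective standing in for a model's validation loss (an illustration, not either tool's API):

```python
# Random search for hyperparameter optimization.
# `objective` is a hypothetical validation loss, minimized near
# lr=0.1, depth=5; a real run would train and evaluate a model here.
import random

def objective(lr, depth):
    return (lr - 0.1) ** 2 + (depth - 5) ** 2 * 0.01

def random_search(n_trials=200, seed=0):
    rng = random.Random(seed)
    best = None
    for _ in range(n_trials):
        lr = rng.uniform(0.001, 1.0)     # sample the search space
        depth = rng.randint(1, 10)
        loss = objective(lr, depth)
        if best is None or loss < best[0]:
            best = (loss, lr, depth)
    return best

loss, lr, depth = random_search()
print(loss, lr, depth)
```

Bayesian optimization improves on this by fitting a surrogate model to past trials and sampling where improvement looks most likely, which matters when each trial is an expensive training run.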