- Implementation of the k-nearest-neighbours (k-NN) algorithm in Python.
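A minimal k-NN sketch in NumPy; the Euclidean metric, the majority vote, and the `fit`/`predict` interface are assumptions, since the task only names the algorithm:

```python
import numpy as np
from collections import Counter

class KNNClassifier:
    def __init__(self, k=3):
        self.k = k

    def fit(self, X, y):
        # k-NN is a lazy learner: fitting just stores the training data.
        self.X = np.asarray(X, dtype=float)
        self.y = np.asarray(y)
        return self

    def predict(self, X):
        preds = []
        for x in np.asarray(X, dtype=float):
            # Euclidean distances from x to every training point.
            d = np.linalg.norm(self.X - x, axis=1)
            # Indices of the k nearest neighbours.
            nn = np.argsort(d)[:self.k]
            # Majority vote among the neighbours' labels.
            preds.append(Counter(self.y[nn]).most_common(1)[0][0])
        return np.array(preds)
```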
- Gaussian Naive Bayes classifier in Python.
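A possible Gaussian Naive Bayes sketch: per-class priors plus per-feature Gaussian likelihoods, with a small variance floor added for numerical stability (the floor value is an assumption):

```python
import numpy as np

class GaussianNB:
    def fit(self, X, y):
        X, y = np.asarray(X, float), np.asarray(y)
        self.classes = np.unique(y)
        # Class priors and per-class, per-feature means and variances.
        self.priors = np.array([(y == c).mean() for c in self.classes])
        self.means = np.array([X[y == c].mean(axis=0) for c in self.classes])
        self.vars = np.array([X[y == c].var(axis=0) + 1e-9 for c in self.classes])
        return self

    def predict(self, X):
        X = np.asarray(X, float)
        # log p(c) + sum_j log N(x_j | mu_cj, var_cj), evaluated for every class.
        log_lik = -0.5 * ((np.log(2 * np.pi * self.vars)[None, :, :]
                           + (X[:, None, :] - self.means[None, :, :]) ** 2
                           / self.vars[None, :, :]).sum(axis=2))
        return self.classes[np.argmax(np.log(self.priors) + log_lik, axis=1)]
```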
- Implementation of the Perceptron and the perceptron learning rule for 3-dimensional binary inputs (plus a constant bias input).
- OR perceptron.
- XOR perceptron (XOR is not linearly separable, so a single perceptron cannot learn it; see the sketch below).
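One way the perceptron learning rule for these items might look; the zero weight initialisation, learning rate, and epoch cap are assumptions. The 3-input OR function is linearly separable, so the rule converges on it, while a 3-input XOR (odd parity) is not, which motivates the multi-layer network sketched further below:

```python
import numpy as np

def train_perceptron(X, y, lr=1.0, epochs=100):
    """Perceptron learning rule with a constant bias input appended to each x."""
    Xb = np.hstack([X, np.ones((len(X), 1))])   # bias input fixed at 1
    w = np.zeros(Xb.shape[1])                   # zero init is an assumption
    for _ in range(epochs):
        errors = 0
        for x, t in zip(Xb, y):
            out = 1 if w @ x > 0 else 0         # hard threshold activation
            w += lr * (t - out) * x             # w <- w + eta * (target - output) * x
            errors += int(out != t)
        if errors == 0:                         # converged: all patterns correct
            break
    return w

# All 3-dimensional binary inputs.
X = np.array([[a, b, c] for a in (0, 1) for b in (0, 1) for c in (0, 1)])
y_or = (X.sum(axis=1) > 0).astype(int)          # 3-input OR: learnable
w = train_perceptron(X, y_or)
# 3-input XOR (odd parity) is not linearly separable; the rule never converges:
y_xor = (X.sum(axis=1) % 2).astype(int)
```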
- Feed-Forward Neural Networks.
- Perceptron learning rules.
- Randomly assigned initial weights within a fixed range for a fully-connected 2-layer feed-forward neural network with sigmoid activation functions (see the sketch below).
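A sketch of such a 2-layer network trained with backpropagation on XOR. The hidden-layer size, the init range of [-0.5, 0.5], the learning rate, and the epoch count are all assumptions; with only 2 hidden units training can occasionally stall in a local minimum, in which case a different seed helps:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Fully-connected 2-layer net: input -> hidden (sigmoid) -> output (sigmoid).
n_in, n_hid, n_out = 2, 2, 1
W1 = rng.uniform(-0.5, 0.5, (n_hid, n_in + 1))   # +1 column for the bias input
W2 = rng.uniform(-0.5, 0.5, (n_out, n_hid + 1))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [1], [1], [0]], float)        # XOR: needs the hidden layer

lr = 0.5
for _ in range(20000):
    for x, t in zip(X, y):
        # Forward pass with constant bias inputs of 1.
        a0 = np.append(x, 1.0)
        h = sigmoid(W1 @ a0)
        a1 = np.append(h, 1.0)
        o = sigmoid(W2 @ a1)
        # Backward pass; the sigmoid derivative is o * (1 - o).
        delta_o = (o - t) * o * (1 - o)
        delta_h = (W2[:, :-1].T @ delta_o) * h * (1 - h)
        W2 -= lr * np.outer(delta_o, a1)
        W1 -= lr * np.outer(delta_h, a0)

# After training, the outputs should approximate XOR
# (retry with another seed if training stalled):
for x in X:
    h = sigmoid(W1 @ np.append(x, 1.0))
    print(x, sigmoid(W2 @ np.append(h, 1.0)))
```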
- SVM solver (e.g. MATLAB's fitcsvm function) to learn the linear SVM parameters.
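In Python, scikit-learn's `SVC(kernel="linear")` plays the role that `fitcsvm` plays in MATLAB; the toy data and the large-`C` hard-margin setting below are purely illustrative:

```python
import numpy as np
from sklearn.svm import SVC

X = np.array([[1.0, 2.0], [2.0, 3.0], [3.0, 3.0],   # toy, linearly separable
              [6.0, 5.0], [7.0, 8.0], [8.0, 6.0]])  # data for illustration only
y = np.array([-1, -1, -1, 1, 1, 1])

clf = SVC(kernel="linear", C=1e3).fit(X, y)         # large C ~ hard margin
w, b = clf.coef_[0], clf.intercept_[0]              # hyperplane w . x + b = 0
print("w =", w, "b =", b)
print("support vectors:", clf.support_vectors_)
```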
- Implementation of a self-training system using a logistic regression classifier (semi-supervised classification).
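A minimal self-training loop under assumed details: pseudo-label only the unlabelled points whose predicted class probability clears a confidence threshold (0.95 here, an arbitrary choice), fold them into the labelled set, and retrain:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def self_train(X_lab, y_lab, X_unlab, threshold=0.95, max_rounds=10):
    X_lab, y_lab = X_lab.copy(), y_lab.copy()
    for _ in range(max_rounds):
        clf = LogisticRegression(max_iter=1000).fit(X_lab, y_lab)
        if len(X_unlab) == 0:
            break
        proba = clf.predict_proba(X_unlab)
        keep = proba.max(axis=1) >= threshold   # adopt only confident pseudo-labels
        if not keep.any():
            break
        X_lab = np.vstack([X_lab, X_unlab[keep]])
        y_lab = np.concatenate([y_lab, clf.classes_[proba[keep].argmax(axis=1)]])
        X_unlab = X_unlab[~keep]                # shrink the unlabelled pool
    return clf
```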
- Implementation of the polynomial fit solver for 2-dimensional input data as a linear regression learner. Make sure your implementation can handle polynomial fits of different orders (at least up to 4th order).
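One way to reduce the 2-D polynomial fit to linear regression: expand each input (x1, x2) into all monomials x1^i * x2^j with i + j <= order, then solve the resulting least-squares problem:

```python
import numpy as np

def poly_features(X, order):
    # All monomials x1^i * x2^j with i + j <= order (includes the constant term).
    x1, x2 = X[:, 0], X[:, 1]
    cols = [x1**i * x2**j for i in range(order + 1)
                          for j in range(order + 1 - i)]
    return np.column_stack(cols)

def fit_poly(X, y, order=4):
    Phi = poly_features(X, order)
    # Least-squares solution of Phi @ w = y; lstsq also handles rank deficiency.
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def predict_poly(X, w, order=4):
    return poly_features(X, order) @ w
```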
- Consider the problem where we want to predict the gender of a person from a set of input parameters, namely height, weight, and age. Implement logistic regression to classify this data (use the individual data elements, i.e. height, weight, and age, as features).
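A from-scratch sketch using batch gradient descent on the cross-entropy loss; the feature standardisation, learning rate, and epoch count are assumptions (standardising matters here because height, weight, and age live on very different scales):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logreg(X, y, lr=0.1, epochs=5000):
    # Standardise features so a single step size suits all three of them.
    mu, sd = X.mean(axis=0), X.std(axis=0)
    Xs = np.hstack([(X - mu) / sd, np.ones((len(X), 1))])  # append bias column
    w = np.zeros(Xs.shape[1])
    for _ in range(epochs):
        p = sigmoid(Xs @ w)
        w -= lr * Xs.T @ (p - y) / len(y)  # gradient of the mean cross-entropy
    return w, mu, sd

def predict_logreg(X, w, mu, sd):
    Xs = np.hstack([(X - mu) / sd, np.ones((len(X), 1))])
    return (sigmoid(Xs @ w) >= 0.5).astype(int)
```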
- Implementation of Linear Discriminant Analysis.
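A two-class Fisher LDA sketch: project onto w = Sw^-1 (mu1 - mu0) and threshold at the midpoint of the projected class means (the midpoint threshold ignores class priors, a simplifying assumption):

```python
import numpy as np

def fit_lda(X, y):
    X0, X1 = X[y == 0], X[y == 1]
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    # Within-class scatter matrix (unnormalised sum of class covariances).
    Sw = np.cov(X0, rowvar=False) * (len(X0) - 1) \
       + np.cov(X1, rowvar=False) * (len(X1) - 1)
    w = np.linalg.solve(Sw, mu1 - mu0)          # Fisher's discriminant direction
    threshold = w @ (mu0 + mu1) / 2.0           # midpoint of projected means
    return w, threshold

def predict_lda(X, w, threshold):
    return (X @ w > threshold).astype(int)
```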
- Consider the problem where we want to predict whether we are going to win a game of Tic-Tac-Toe from the current board configuration. To make this decision we have access to the state of the board in the form of 9 attributes reflecting the locations on the board, each one with 3 possible values (x, o, b) representing the two players or blank, respectively. Training and test data sets are provided for this problem.
a) Show the construction of a 2-level decision tree using minimum entropy as the construction criterion on the training data set. Include the entropy calculations and the construction decision for each node of the 2-level tree.
b) Implement a decision tree learner for this particular problem that can derive decision trees with an arbitrary, pre-determined depth (up to the maximum depth where all data sets at the leaves are pure) using the information gain criterion.
c) Apply the tree from part b) to the test data set for all possible tree depths (i.e. 1–9) and compare the classification accuracy on the test set with the one on the training set for each depth. For which depths does the result indicate overfitting?
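For parts a) and b), the relevant quantities are the entropy H(S) = -sum_i p_i log2(p_i) and the information gain Gain(S, A) = H(S) - sum_v (|S_v|/|S|) H(S_v). A depth-limited ID3-style sketch for the categorical board features follows; the majority-class fallbacks for empty branches and unseen attribute values are assumptions about unstated details:

```python
import numpy as np
from collections import Counter

def entropy(y):
    counts = np.array(list(Counter(y).values()), float)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def info_gain(X, y, attr):
    # Gain(S, A) = H(S) - sum_v (|S_v| / |S|) * H(S_v)
    vals = set(X[:, attr])
    rem = sum((X[:, attr] == v).mean() * entropy(y[X[:, attr] == v]) for v in vals)
    return entropy(y) - rem

def build_tree(X, y, depth):
    majority = Counter(y).most_common(1)[0][0]
    if depth == 0 or len(set(y)) == 1:
        return majority                      # leaf node: predicted class
    # Split on the attribute with maximum information gain.
    attr = max(range(X.shape[1]), key=lambda a: info_gain(X, y, a))
    node = {"attr": attr, "default": majority, "children": {}}
    for v in set(X[:, attr]):                # one branch per value 'x', 'o', 'b'
        m = X[:, attr] == v
        node["children"][v] = build_tree(X[m], y[m], depth - 1)
    return node

def predict_one(node, x):
    while isinstance(node, dict):
        # Unseen attribute values fall back to the node's majority class.
        node = node["children"].get(x[node["attr"]], node["default"])
    return node
```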
- Using the data and decision tree algorithm from the Tic-Tac-Toe problem above, choose a decision tree depth that does not overfit but still achieves some baseline classification performance (at least depth 4), and apply bagging to the problem (see the sketch below).
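A bagging sketch on top of the `build_tree`/`predict_one` helpers from the decision-tree sketch above: each round trains a tree on a bootstrap sample, and the ensemble classifies by majority vote (25 rounds is an arbitrary assumption):

```python
import numpy as np
from collections import Counter

def bagging_fit(X, y, depth=4, n_trees=25, seed=0):
    rng = np.random.default_rng(seed)
    trees = []
    for _ in range(n_trees):
        idx = rng.integers(0, len(X), len(X))   # bootstrap: sample with replacement
        trees.append(build_tree(X[idx], y[idx], depth))
    return trees

def bagging_predict(trees, x):
    votes = [predict_one(t, x) for t in trees]
    return Counter(votes).most_common(1)[0][0]  # majority vote over the ensemble
```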
- Using the data and decision tree algorithm from the Tic-Tac-Toe problem and the depth chosen for the bagging problem above, apply boosting: implement AdaBoost on top of your decision tree classifier (see the sketch below).
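An AdaBoost.M1-style sketch, again reusing `build_tree`/`predict_one`. Since the tree learner above has no weighted splitting criterion, this version approximates sample weighting by weighted resampling, which is a known simplification rather than the canonical weighted-error formulation; the label names in `adaboost_predict` are placeholders:

```python
import numpy as np

def adaboost_fit(X, y, depth=4, n_rounds=20, seed=0):
    rng = np.random.default_rng(seed)
    n = len(X)
    w = np.full(n, 1.0 / n)                     # uniform initial sample weights
    trees, alphas = [], []
    for _ in range(n_rounds):
        idx = rng.choice(n, n, p=w)             # weighted bootstrap sample
        tree = build_tree(X[idx], y[idx], depth)
        pred = np.array([predict_one(tree, x) for x in X])
        err = w[pred != y].sum()                # weighted training error
        if err >= 0.5:                          # weak learner no better than chance
            continue
        err = max(err, 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)   # classifier weight
        # Up-weight misclassified samples, down-weight the rest, renormalise.
        w *= np.exp(alpha * np.where(pred != y, 1.0, -1.0))
        w /= w.sum()
        trees.append(tree)
        alphas.append(alpha)
    return trees, alphas

def adaboost_predict(trees, alphas, x, labels=("positive", "negative")):
    # 'labels' maps the two classes to +1/-1; the names here are placeholders.
    score = sum(a * (1 if predict_one(t, x) == labels[0] else -1)
                for t, a in zip(trees, alphas))
    return labels[0] if score >= 0 else labels[1]
```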