\documentclass{article}
\title{ML for Humanities: Progress Report}
\begin{document}
\maketitle
\paragraph{Introduction} The following is a short description of the tasks we were able to complete while following the Coursera course, as well as the obstacles we encountered along the way. There are two reasons for giving this description: first, to give Arianna and Veruska a better overview of what the group of students has done; second, to give some pointers as to what worked and what did not, in case the project \emph{ML for Humanities} becomes a regular offering. The report proceeds by week, first specifying what we planned to do, then what we actually managed to do and what difficulties arose.
\paragraph{Week 1} \underline{What we planned to do:}\\
Follow weeks one to three of the online course. This includes the following topics: Introduction to ML in general, Introduction to Linear Algebra, Simple Linear Regression, Multivariate Linear Regression, Logistic Regression.\\
We planned to implement three things: the Simple Linear Regression, the Multivariate Linear Regression, and the Logistic Regression.\\
\underline{What we did:} We followed the first three weeks of the course individually. We mostly managed to implement the Simple and Multivariate models, but only parts of the Logistic Regression because we ran out of time.\\
\underline{Conclusions:} For some of us, it turned out to be necessary to come back to the material on logistic regression when working on neural networks, especially multi-class classification and regularization. Overall, for future versions of the course, special focus should be put on logistic regression, since it is fundamental to all the material on neural networks.
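For illustration, the following is a minimal sketch of the kind of logistic regression implementation we were working towards, assuming NumPy and batch gradient descent on the cross-entropy cost; the function names are ours and not part of the course material.

\begin{verbatim}
import numpy as np

def sigmoid(z):
    # Logistic function; the same activation reappears in the neural networks.
    return 1.0 / (1.0 + np.exp(-z))

def train_logistic_regression(X, y, alpha=0.1, iterations=1000):
    # X: (m, n) feature matrix with a leading column of ones for the bias term.
    # y: (m,) vector of 0/1 labels.
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(iterations):
        predictions = sigmoid(X @ theta)
        # Gradient of the cross-entropy cost with respect to theta.
        gradient = X.T @ (predictions - y) / m
        theta -= alpha * gradient
    return theta
\end{verbatim}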
\paragraph{Week 2} \underline{What we planned to do:}\\
Follow weeks four, five and seven of the online course. This includes the following topics: Introduction to Neural Networks, Representation in Neural Networks, Training Neural Networks, Support Vector Machines. Additionally, we decided to try out pair programming as a working technique: we formed pairs in which one partner would code while the other supervised the coding process.\\
We planned to implement three things: first, a toy NN that is able to represent binary logical operators when provided with the correct parameters/weights beforehand; second, an implementation of the same toy NN that is able to learn its parameters from labeled training data; third, a Support Vector Machine.\\
\underline{What we did:} We followed all the course work. Both groups were able to code up a working NN for all binary logical operators; a minimal sketch of the idea is given below. Both groups either got stuck on, or did not get to, the task of training the NN.\\
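The following sketch illustrates the first task with a single sigmoid unit computing AND and OR from hand-picked weights, assuming NumPy; it is only an outline of the idea, not our actual implementation.

\begin{verbatim}
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def unit(weights, inputs):
    # One sigmoid unit: bias weight first, then one weight per input.
    return sigmoid(weights[0] + np.dot(weights[1:], inputs))

# With inputs in {0, 1} and large weights, the sigmoid saturates near 0 or 1,
# so each unit behaves like a logical operator.
AND_WEIGHTS = np.array([-30.0, 20.0, 20.0])
OR_WEIGHTS = np.array([-10.0, 20.0, 20.0])

for x1 in (0, 1):
    for x2 in (0, 1):
        x = np.array([x1, x2])
        print(x1, x2, round(unit(AND_WEIGHTS, x)), round(unit(OR_WEIGHTS, x)))
\end{verbatim}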
\underline{Conclusions:} The material for this week turned out to be difficult to master within one week, mainly for one reason: for the assignments, a lot of code taking care of tasks such as data preparation was already provided in MATLAB. Reimplementing this in Python would have meant a substantial amount of additional work. For future versions of this course it would therefore be advantageous either to use MATLAB throughout or to provide Python equivalents of the supplied code. Some of us decided to put additional focus on neural networks and to continue working on them throughout week 3. This might result in code that could be used for future versions of this course.
\end{document}