Course Material for Artificial Intelligence and Machine Learning - Unit 2 @ Computer Science Dept, Sapienza Bachelor in Applied Computer Science and Artificial Intelligence
Taught in English 🇺🇸
II year, II semester, ACSAI degree (bachelor)
Description: The course introduces machine learning, a class of methods that learn patterns from data and make predictions on new data. This Unit complements the Unit I course on AI and formal logic. In terms of structure and philosophy, the course balances theoretical content (knowing what you are doing) with practical sessions (knowing how to do it). Note that this is an introductory course to Machine Learning (ML), not a full course on Deep Learning (DL), although some basic DL concepts are introduced and covered. The course emphasizes the theory, but the theory is always accompanied by practical applications on toy data so that students understand in practice what happens (this sometimes means implementing the underlying basic algorithm from scratch). When possible, the course makes a few connections to recent research papers in AI & ML and shows some interesting AI applications.
- Introduce you to the basic principles of Machine Learning
- Provide knowledge of the main learning modalities (supervised, unsupervised, parametric/non-parametric)
- Develop awareness of the mathematical tools behind these methods
- Set strong foundations for more advanced courses (e.g., Deep Learning)
- Develop critical thinking and help raise the next generation of scientists
- Show a few cool, practical applications
Every year I ask the students attending the class to fix bugs or typos that they may find in the slides. Since the material is shown as Jupyter Notebooks, they can fork the repository, fix the issues, and open a pull request. By doing so, students get bonus points for the final exam, up to a maximum bonus. The list below shows all the people who contributed to the repository. Thank you!
- Linear algebra: vector/matrix manipulations (geometry in high dimensions)
- Calculus: partial derivatives (cost function, gradients)
- Probability: common distributions; Bayes' rule (learn how NOT to think deterministically)
- Statistics: mean/median/mode; maximum likelihood
We will review these in the first lectures; the short sketch below gives a taste of the kind of operations they cover.
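As a rough illustration (not part of the official course material), the minimal NumPy sketch below touches each prerequisite in turn: a matrix-vector product, a numerical partial derivative of a toy cost function, Bayes' rule on a two-class toy problem, and a maximum-likelihood estimate of a Gaussian mean. All names and numbers are illustrative assumptions, not examples taken from the lectures.

```python
import numpy as np

# Linear algebra: matrix-vector product (geometry in high dimensions)
A = np.array([[2.0, 0.0], [0.0, 3.0]])    # a simple scaling matrix
x = np.array([1.0, 1.0])
print("A @ x =", A @ x)                    # -> [2. 3.]

# Calculus: numerical partial derivatives of a toy cost function J(w) = w1^2 + 3*w2^2
def J(w):
    return w[0] ** 2 + 3.0 * w[1] ** 2

def grad_numeric(f, w, eps=1e-6):
    g = np.zeros_like(w)
    for i in range(len(w)):
        w_plus, w_minus = w.copy(), w.copy()
        w_plus[i] += eps
        w_minus[i] -= eps
        g[i] = (f(w_plus) - f(w_minus)) / (2 * eps)  # central difference
    return g

w = np.array([1.0, -2.0])
print("grad J(w) ~", grad_numeric(J, w))   # analytic gradient is [2*w1, 6*w2] = [2, -12]

# Probability: Bayes' rule, P(class | x) proportional to P(x | class) * P(class)
prior = np.array([0.6, 0.4])               # P(class), assumed toy values
likelihood = np.array([0.2, 0.5])          # P(x | class) for one observation x
posterior = prior * likelihood
posterior /= posterior.sum()
print("P(class | x) =", posterior)

# Statistics: the maximum-likelihood estimate of a Gaussian mean is the sample mean
samples = np.array([4.8, 5.1, 5.0, 4.9, 5.2])
print("ML estimate of the mean:", samples.mean())
```

If you can read and predict the output of a snippet like this, you are in good shape for the first lectures; if not, the review sessions at the start of the course cover exactly these tools.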