# Machine Learning

## News

- 20/09/2021 Webex Room for streaming: https://unicam.webex.com/meet/marco.piangerelli
- 01/11/2021 `Holiday`
- 02/11/2021 `Lesson Canceled`
- 29/11/2021 `Lesson Canceled`

## General Info

**Teacher**:

**ESSE3 Link**

**Scheduling of Lectures**:

- Monday 09:00 - 11:00 - AB1
- Tuesday 16:00 - 18:00 - AB1

**Student Office Hours**:

- Tuesday from 09:00 to 11:00

**Degrees**:

## Course Objectives

KNOWLEDGE AND UNDERSTANDING

The aim of the course is to provide the student with knowledge and skills in the area of machine learning.

At the end of the course the student should be able to:

- distinguish the various machine learning paradigms;
- know the fundamentals of learning theory;
- know classification, regression, clustering, and dimensionality reduction algorithms.

APPLYING KNOWLEDGE AND UNDERSTANDING

After completing the course, the student must demonstrate the ability to:

- apply the different machine learning paradigms;
- implement classification, regression, clustering, and dimensionality reduction algorithms;
- design and implement systems able to learn automatically from real data and situations.

COMMUNICATION SKILLS

At the end of this training activity, the student will be able to express themselves clearly and in appropriate terms, in English, both in discussions on learning and in presenting the results of research concerning technical aspects of machine learning.

LEARNING SKILLS

At the end of this training activity the student will be able to:

- find and learn the many algorithms and techniques presented in the field of machine learning;
- implement and use new algorithms.

## Syllabus

- Notions of Probability and Linear Algebra

- Supervised Learning

- Linear Models and Regression for Classification

- Decision Tree, Random Forest, Ensemble

- Unsupervised Learning

- Dimensionality Reduction

- Clustering

- Neural Networks (an introduction): FFNN, RNN, SOM

- Balanced and unbalanced data

- Evaluation Metrics (AUC, ROC, Confusion Matrix…)

- Examples of coding in Python and R

- Probabilistic learning theory and the “learning problem”

- The VC-dimension

- Support Vector Machines (Proof of the maximum margin)
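As a small taste of the "Examples of coding in Python and R" and "Evaluation Metrics" items above, the sketch below (illustrative only, not official course material; it assumes scikit-learn is installed) fits a linear classifier on synthetic data and computes a confusion matrix and the ROC AUC:

```python
# Illustrative sketch: train a linear classifier and evaluate it with the
# metrics listed in the syllabus (confusion matrix, ROC AUC), using scikit-learn.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix, roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic binary-classification data (stand-in for real course data).
X, y = make_classification(n_samples=200, n_features=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression().fit(X_train, y_train)

# Rows of the confusion matrix are true classes, columns are predictions.
cm = confusion_matrix(y_test, clf.predict(X_test))
# ROC AUC needs class-probability scores, not hard labels.
auc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
```

The same workflow (fit, predict, score) applies to the tree-based and SVM models listed above by swapping the estimator class.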

## Material

**Course Slides and Videos**

- 20/09/2021 Lezione0
- 21/09/2021 linear_regression_notes
- 27/09/2021 Lezione2
- 28/09/2021 See 27/09/2021
- 04/10/2021 - Notes Gradient Descent (convergence)
- 18/10/2021 Lesson-8
- 19/10/2021 Lezione9
- 25/10/2021 Notes_classification_trees
- 26/10/2021 Notes_regression_trees
- 01/11/2021 `HOLIDAY`
- 02/11/2021 `CANCELED`

- 08/11/2021 Lezione12
- 09/11/2021 Lezione13
- 15/11/2021 Lezione14
- 16/11/2021 See 07/12/2021
- 22/11/2021 See 07/12/2021
- 23/11/2021 See 07/12/2021
- 29/11/2021 `CANCELED`

- 30/11/2021 Lezione18
- 06/12/2021 not recorded due to technical problems
- 07/12/2021 Lezione19
- 13/12/2021 Lezione20

**Link to Recorded Lessons**

**Reference books**

- C.M. Bishop, Pattern Recognition and Machine Learning, Springer - 2006
- D. Barber, Bayesian Reasoning and Machine Learning, Cambridge University Press - 2012 (online version available)
- T. Hastie, R. Tibshirani, J. Friedman, The Elements of Statistical Learning, Springer - 2008 (online version available)
- G. James, D. Witten, T. Hastie, R. Tibshirani, An Introduction to Statistical Learning, 2nd Edition, Springer - 2021 (online version available)

**Other sources**
Some of the material is taken from the Caltech machine learning course taught by Prof. Yaser Abu-Mostafa and his collaborators.

**Additional Material**

- 11/10/2020 Vapnik paper
- 27/12/2021 Hoeffding's Inequality
- 28/09/2019 Optimization_notes
- 28/09/2019 SVM1
- 28/09/2019 SVM2
- 28/09/2019 SVM3
- 03/11/2021 Explaining AdaBoost- Schapire
- 03/11/2021 Ensemble Methods - Zhou
- 28/12/2021 Bias-Variance I
- 28/12/2021 Bias-Variance II
- 28/12/2021 Generalization Theory
- 28/12/2021 Neural Networks Theory
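As a pointer into the generalization-theory material above: Hoeffding's inequality, the starting point of the "learning problem" analysis in the Caltech course notes, bounds the gap between in-sample and out-of-sample error for a single fixed hypothesis $h$ on $N$ i.i.d. samples:

```latex
\mathbb{P}\left[\,\left|E_{\text{in}}(h) - E_{\text{out}}(h)\right| > \varepsilon\,\right] \le 2e^{-2\varepsilon^{2}N}
```

A union bound over $M$ hypotheses multiplies the right-hand side by $M$; VC theory then replaces $M$ with a polynomial growth function governed by the VC dimension, which is what makes learning with infinite hypothesis sets feasible.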

## Exams