# Machine Learning

## News

- **29/09/2020**: Starting date
- **27/10/2020**: Today's lesson is canceled

## General Info

**Teacher**:

**ESSE3 Link**

**Class schedule**:

- Monday 09:00 - 11:00 - AB1
- Tuesday 16:00 - 18:00 - AB1

**Students Office hours**:

- Tuesday from 09:00 to 11:00

## Course Objectives

KNOWLEDGE AND UNDERSTANDING

The aim of the course is to provide the student with knowledge and skills in the area of machine learning.

At the end of the course the student should be able to:

- distinguish the various machine learning paradigms;
- know learning theory;
- know classification, regression, clustering, and dimensionality reduction algorithms;

APPLYING KNOWLEDGE AND UNDERSTANDING

After completing the course, the student must demonstrate the ability to:

- apply the different machine learning paradigms;
- implement classification, regression, clustering, and dimensionality reduction algorithms;
- design and implement systems able to learn automatically from real data and situations;

COMMUNICATION SKILLS

At the end of this training activity, the student will be able to express themselves clearly and in appropriate terms, using the English language, both in discussions about learning and in presenting the results of research concerning technical aspects of machine learning.

LEARNING SKILLS

At the end of this training activity the student will be able to:

- find and learn the many algorithms and techniques presented in the field of machine learning;
- implement and use new algorithms.

## Course Contents

- Notions of Probability and Linear Algebra

- Supervised Learning

- Linear Models and Regression for Classification

- Decision Tree, Random Forest, Ensemble

- Unsupervised Learning

- Dimensionality Reduction

- Clustering

- Neural Networks (an introduction): FFNN, RNN, SOM

- Balanced and unbalanced data

- Evaluation Metrics (AUC, ROC, Confusion Matrix…)

- Markov Chains and Hidden Markov Models

- Examples of coding in Python and R

- Probabilistic learning theory and the “learning problem”

- The VC-dimension

- Support Vector Machines (Proof of the maximum margin)
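To give a flavor of the "Evaluation Metrics" and "Examples of coding in Python" items above, here is a minimal, self-contained sketch (not from the course material; all names and data are illustrative) of building a binary confusion matrix and deriving precision and recall from it in plain Python:

```python
# Illustrative sketch: binary confusion matrix counts and precision/recall.
# Labels are assumed to be 0 (negative) or 1 (positive).

def confusion_counts(y_true, y_pred):
    """Return (tp, fp, fn, tn) counts for binary labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return tp, fp, fn, tn

def precision_recall(y_true, y_pred):
    """Precision = tp/(tp+fp); recall = tp/(tp+fn); 0.0 if undefined."""
    tp, fp, fn, _ = confusion_counts(y_true, y_pred)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall

# Toy predictions with one false positive and one false negative.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]
print(confusion_counts(y_true, y_pred))   # (3, 1, 1, 3)
print(precision_recall(y_true, y_pred))   # (0.75, 0.75)
```

In the course itself these quantities would typically be computed with a library such as scikit-learn; the plain-Python version above just makes the definitions behind the confusion matrix explicit.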

## Study material

**Course Slides and video**

- 29/09/2020 Lezione0
- 05/10/2020 Lezione1, Video_ML_05_10_2020
- 06/10/2020 Video_ML_06_10_2020
- 12/10/2020 Video_ML_12_10_2020
- 13/10/2020 Lezione2, Video_ML_13_10_2020_svm
- 19/10/2020 Video_ML_19_10_2020_kernels
- 26/10/2020 Lezione3, Video_ML_26_10_2020_NNs
- 27/10/2020 **Canceled**
- 02/11/2020 **Academic Holiday**
- 03/11/2020 Lezione 4
- 09/11/2020 Lezione 5, Video_ML_09_11_2020_Trees
- 10/11/2020 Lezione 6, Video_ML_09_11_2020_EnsembleMethods
- 16/11/2020 Lezione 7
- 17/11/2020 See 23/11/2020 for material
- 24/11/2020 ML_01_12_2020_Probabilistic_Learning_I
- 30/11/2020 ML_30_11_2020_Probabilistic_Learning_II
- 01/12/2020 ML_01_12_2020_Probabilistic_Learning_III
- 07/12/2020 **Academic Holiday**
- 08/12/2020 **Academic Holiday**
- 14/12/2020 ML_14_12_2020_Probabilistic_Learning_IV
- 15/12/2020 Probabilistic_Learning_V
- 22/12/2020 ML_22_12_2020_bias_variance_II

**Reference books**

- C.M. Bishop, *Pattern Recognition and Machine Learning*, Springer, 2006
- D. Barber, *Bayesian Reasoning and Machine Learning*, Cambridge University Press, 2012 (online version available)
- T. Hastie, R. Tibshirani, J. Friedman, *The Elements of Statistical Learning*, Springer, 2008 (online version available)

**Other sources**
Some of the material is taken from the Caltech ML course taught by Prof. Yaser Abu-Mostafa and his collaborators.

**Additional Material**

- 11/10/2020 Vapnik paper
- 11/10/2020 Hoeffding's
- 28/09/2019 Optimization_notes
- 28/09/2019 SVM1
- 28/09/2019 SVM2
- 28/09/2019 SVM3
- 12/11/2019 Explaining AdaBoost - Schapire
- 12/11/2019 Ensemble Methods - Zhou
- 12/12/2019 Bias-Variance I
- 12/12/2019 Bias-Variance II
- 16/12/2019 Vapnik's Inequality

## Exams

**Exam Dates A.Y. 2020/2021**

**Please, once inside your ESSE3 private account, select the dates labeled in the form "(Roman numeral) Prova Parziale scritta".**

**Exam rules**:
The exam consists of two parts: a written test and an oral one.
The oral part is reserved for students who pass the written part with a mark greater than or equal to 18.

**Exam Results**

- N/A

## Introduction to Machine Learning (for Ph.D.)

**Syllabus**

- Probabilistic Learning Theory
- Learning paradigms
- Classification
- Clustering
- Neural Networks
- Support Vector Machines
- Practical issues for ML

**Material**