Machine Learning


  • 03/10/2019: Starting date
  • 07/10/2019 - 13/10/2019: No Classes
  • 17/10/2019: No Class (Hackatober)
  • 4/11/2019: Slides uploaded

Teacher:

ESSE3 Link

Class schedule:

  • Tuesday 11:00 - 13:00 - AB1
  • Thursday 9:00 - 11:00 - LAB2

Students Office hours:

  • Thursday from 11:00 to 13:00

KNOWLEDGE AND UNDERSTANDING

The aim of the course is to provide the student with knowledge and skills in the area of machine learning.
At the end of the course the student should be able to:

  • distinguish the various machine learning paradigms;
  • understand the fundamentals of learning theory;
  • know the main algorithms for classification, regression, clustering and dimensionality reduction.

APPLYING KNOWLEDGE AND UNDERSTANDING
After completing the course, the student must demonstrate the ability to:

  • apply the different machine learning paradigms;
  • implement classification, regression, clustering and dimensionality reduction algorithms;
  • design and implement systems able to learn automatically from real data and situations.

COMMUNICATION SKILLS
At the end of this training activity, the student will be able to express themselves clearly and with appropriate terminology, in English, both in discussions about machine learning and when presenting the results of research on technical aspects of the field.

LEARNING SKILLS

At the end of this training activity the student will be able to:

  • find and learn the many algorithms and techniques presented in the field of machine learning;
  • implement and use new algorithms.

Course Content

  • Notions of Probability and Linear Algebra
  • Supervised Learning
  • Linear Models and Regression for Classification
  • Decision Tree, Random Forest, Ensemble
  • Unsupervised Learning
  • Dimensionality Reduction
  • Clustering
  • Neural networks (an introduction): FFNN, RNN, SOM
  • Balanced and unbalanced data
  • Evaluation Metrics (AUC, ROC, Confusion Matrix…)
  • Markov Chains and Hidden Markov Models
  • Examples of coding in Python and R
  • Probabilistic learning theory and the “learning problem”
  • The VC-dimension (Proof of the maximum margin)
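As a small preview of the supervised-learning and Python-coding topics listed above, here is a minimal k-nearest-neighbours classification sketch. It is purely illustrative: the function name, the toy 2-D dataset and the labels are all invented for this example, and only the Python standard library is used.

```python
from collections import Counter
from math import dist  # Euclidean distance (Python 3.8+)

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training points.

    `train` is a list of (features, label) pairs; `query` is a feature tuple.
    """
    neighbors = sorted(train, key=lambda pair: dist(pair[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Toy 2-D dataset: two clusters labelled "a" and "b"
train = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"), ((0.2, 0.1), "a"),
         ((1.0, 1.0), "b"), ((0.9, 1.1), "b"), ((1.1, 0.9), "b")]

print(knn_predict(train, (0.1, 0.1)))   # -> a
print(knn_predict(train, (1.0, 0.95)))  # -> b
```

In practice the course examples would use a library such as scikit-learn rather than a hand-rolled classifier; the sketch only shows the idea behind instance-based classification.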

Course Slides

Reference books

  • C.M. Bishop, Pattern Recognition and Machine Learning, Springer - 2006
  • D. Barber, Bayesian Reasoning and Machine Learning, Cambridge University Press. - 2012 Online version
  • T. Hastie, R. Tibshirani, J. Friedman, The Elements of Statistical Learning, Springer - 2008 Online version

Other sources: Some of the material is taken from the Caltech ML course given by Prof. Yaser Abu-Mostafa and his collaborators.

Additional Material


Exam Dates A.Y. 2015/2016

  • Winter session dates here
  • Summer session dates here
  • Autumn session dates here
  • Winter session dates here (2016)

Exam rules: The exam consists of two parts: a written test and an oral exam. The oral exam is reserved for students who pass the written part with a mark greater than or equal to 18.

Exam Results

  • N/A

Syllabus

  • Probabilistic learning theory
  • Learning paradigms
  • Classification
  • Clustering
  • Neural Networks
  • Support Vector Machines
  • Practical issues for ML

Material