====== Machine Learning ======
----
===== News =====
* 20/09/2021 Webex Room for streaming: https://unicam.webex.com/meet/marco.piangerelli
* ''01/11/2021 Holiday''
* ''02/11/2021 Lesson Canceled''
* ''29/11/2021 Lesson Canceled''
----
===== General Info =====
**Teacher**:
* [[https://docenti.unicam.it/pdett.aspx?ids=N&tv=d&UteId=207|Prof.ssa Emanuela Merelli]]
* [[https://marcopiangerelli.it|Prof. Marco Piangerelli]]
**ESSE3 Link**
* [[https://didattica.unicam.it/Guide/PaginaADErogata.do?ad_er_id=2021*N0*N0*S1*16859*9989&ANNO_ACCADEMICO=2021&mostra_percorsi=S|Machine Learning - AY 2021/22]]
**Scheduling of Lectures**:
* Monday 09:00 - 11:00 - AB1
* Tuesday 16:00 - 18:00 - AB1
**Students Office hours**:
* Tuesday from 09:00 to 11:00
**Degrees**:
* [[didattica:mscs|MSc in Computer Science (LM-18)]]
----
===== Course Objectives =====
KNOWLEDGE AND UNDERSTANDING\\
The aim of the course is to provide the student with knowledge and skills in the area of machine learning.\\
At the end of the course the student should be able to:
* distinguish the various machine learning paradigms;
* know learning theory;
* know algorithms for classification, regression, clustering and dimensionality reduction.
APPLYING KNOWLEDGE AND UNDERSTANDING\\
After completing the course, the student must demonstrate the ability to:
* apply the different machine learning paradigms;
* implement classification, regression, clustering and dimensionality reduction algorithms;
* design and implement systems able to learn automatically from real data and situations.
COMMUNICATION SKILLS\\
At the end of this training activity, the student will be able to express themselves clearly and in appropriate terms, in English, in discussions about learning, and to present the results of research concerning technical aspects of machine learning.
LEARNING SKILLS\\
At the end of this training activity the student will be able to:
* find and learn the many algorithms and techniques presented in the field of machine learning;
* implement and use new algorithms.
----
===== Syllabus =====
* Notions of Probability and Linear Algebra
* Supervised Learning
* Linear Models and Regression for Classification
* Decision Tree, Random Forest, Ensemble
* Unsupervised Learning
* Dimensionality Reduction
* Clustering
* Neural Networks (an introduction): FFNN, RNN, SOM
* Balanced and unbalanced data
* Evaluation Metrics (AUC, ROC, Confusion Matrix...)
* Examples of coding in Python and R (see the short sketch after this list)
* Probabilistic learning theory and the "learning problem"
* The VC-dimension
* Support Vector Machines (Proof of the maximum margin)
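As a companion to the "Examples of coding in Python and R" item above, here is a minimal, self-contained Python sketch (illustrative only, not taken from the course material) that ties together several syllabus topics: supervised learning, a linear model, a decision tree, and evaluation via confusion matrix and ROC AUC. The dataset and model choices are assumptions made for this example.
<code python>
# Minimal sketch: fit two supervised classifiers with scikit-learn and
# compare them with the evaluation metrics listed in the syllabus.
# Dataset and hyperparameters are illustrative assumptions, not course choices.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import confusion_matrix, roc_auc_score

# Binary classification data, split into train and test sets
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

for model in (LogisticRegression(max_iter=5000),
              DecisionTreeClassifier(max_depth=3, random_state=0)):
    model.fit(X_train, y_train)
    preds = model.predict(X_test)
    scores = model.predict_proba(X_test)[:, 1]   # probability of the positive class
    print(type(model).__name__)
    print(confusion_matrix(y_test, preds))        # rows: true class, columns: predicted class
    print("ROC AUC:", roc_auc_score(y_test, scores))
</code>
The same workflow (split, fit, predict, evaluate) applies unchanged to the other supervised models covered in the course; only the estimator object changes.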
----
===== Material =====
**Course Slides and Videos**
* 20/09/2021 {{didattica:ay2122:ml:lezione0_locked.pdf|Lezione0}}
* 21/09/2021 {{didattica:ay2122:ml:linear_regression_locked.pdf|linear_regression_notes}}
* 27/09/2021 {{didattica:ay2122:ml:lezione1_locked.pdf|Lezione2}}
* 28/09/2021 See 27/09/2021
* 04/10/2021 Notes on Gradient Descent (convergence)
* 05/10/2021 {{didattica:ay2122:ml:lezione2_locked.pdf|Lezione3}} - {{didattica:ay2122:ml:svm_locked.pdf|svm_notes}}
* 11/10/2021 {{didattica:ay2122:ml:07kernels_protetto.pdf|kernel Notes I}} {{didattica:ay2122:ml:rbfkernel_protetto.pdf|kernel Notes II}} [[https://unicam.webex.com/unicam/ldr.php?RCID=c1db86adaef47cfe0220d2ef48571f95|Lesson-6]]
* 12/10/2021 {{didattica:ay2122:ml:backprop_locked.pdf|Note_backprop}} - {{didattica:ay2122:ml:lezione7_locked.pdf|Lezione7}} - [[https://unicam.webex.com/unicam/ldr.php?RCID=5a2fc84d53fa27cede891f293b4fd04d|Lesson-7]]
* 18/10/2021 [[https://unicam.webex.com/unicam/ldr.php?RCID=0f4c91088d113bd9495d79e7518e257e|Lesson-8]]
* 19/10/2021 {{didattica:ay2122:ml:lezione_8_9_locked.pdf|Lezione9}}
* 25/10/2021 {{didattica:ay2122:ml:classification_trees_locked.pdf|Notes_classification_trees}}
* 26/10/2021 {{didattica:ay2122:ml:regression_trees_locked.pdf|Notes_regression_trees}}
* 01/11/2021 ''HOLIDAY''
* 02/11/2021 ''CANCELED''
* 08/11/2021 {{didattica:ay2122:ml:lezione7_21_22_protetta.pdf|Lezione12}}
* 09/11/2021 {{didattica:ay2122:ml:lezione8_21_22_locked.pdf|Lezione13}}
* 15/11/2021 {{didattica:ay2122:ml:lezione9_21_22_locked.pdf|Lezione14}}
* 16/11/2021 See 07/12/2021
* 22/11/2021 See 07/12/2021
* 23/11/2021 See 07/12/2021
* 29/11/2021 ''CANCELED''
* 30/11/2021 {{didattica:ay2122:ml:lezione13_21_22_locked.pdf|Lezione18}}
* 06/12/2021 Not recorded due to technical problems
* 07/12/2021 {{didattica:ay2122:ml:lezione9_21_22_protetto.pdf|Lezione19}}
* 13/12/2021 {{didattica:ay2122:ml:lezione10_21_22_protetto.pdf|Lezione20}}
**Link to Recorded Lessons**
* [[https://docs.google.com/spreadsheets/d/1B6sc42a2arXmUC0FdkJHnAg6bhXxDODKrSYRB_LBzNM/edit#gid=0|Link]]
**Reference books**
* C.M. Bishop, Pattern Recognition and Machine Learning, Springer - 2006
* D. Barber, Bayesian Reasoning and Machine Learning, Cambridge University Press. - 2012 [[http://web4.cs.ucl.ac.uk/staff/D.Barber/pmwiki/pmwiki.php?n=Brml.Online|Online version]]
* T. Hastie, R. Tibshirani, J. Friedman, The Elements of Statistical Learning, Springer - 2008 [[https://web.stanford.edu/~hastie/ElemStatLearn//printings/ESLII_print10.pdf|Online version]]
* G. James, D. Witten T. Hastie, R. Tibshirani, J. Friedman, An Introduction to Statistical Learning - 2nd Edition, Springer - 2014 [[https://web.stanford.edu/~hastie/ISLR2/ISLRv2_website.pdf|Online version]]
**Other sources**
Some of the material is taken from the Caltech ML course given by Prof. Yaser Abu-Mostafa and his collaborators.
**Additional Material**
* 23/09/2021 {{didattica:ay2122:ml:math4ml.pdf|Math for Machine Learning (excluding Bayesian probability)}}
* 11/10/2020 {{didattica:magistrale:ml:ay_1920:cortes-vapnik1995_article_support-vectornetworks.pdf|Vapnik paper}}
* 27/12/2021 {{didattica:ay2122:ml:hoeffding_protetto.pdf|Hoeffding's inequality}}
* 28/09/2019 {{didattica:ay2122:ml:optimization_protetto.pdf|Optimization_notes}}
* 28/09/2019 {{didattica:magistrale:ml:ay_2021:svm_notes1.pdf|SVM1}}
* 28/09/2019 {{didattica:magistrale:ml:ay_2021:svm_notes2.pdf|SVM2}}
* 28/09/2019 {{didattica:magistrale:ml:ay_2021:svm_notes4.pdf|SVM3}}
* 03/11/2021 {{didattica:ay2122:ml:explaining_adaboost_schapire.pdf|Explaining AdaBoost- Schapire}}
* 03/11/2021 {{didattica:ay2122:ml:ensemble_methods_zhou.pdf|Ensemble Methods - Zhou}}
* 28/12/2021 {{didattica:ay2122:ml:bias_variance_protetto.pdf|Bias-Variance I}}
* 28/12/2021 {{didattica:ay2122:ml:bias_varianceii_protetto.pdf|Bias-Variance II}}
* 28/12/2021 {{didattica:ay2122:ml:vapnik_recap_protetto.pdf|Generalization Theory}}
* 28/12/2021 {{didattica:ay2122:ml:guilhoto_nn_for_math.pdf|Neural Networks Theory}}
**Exams**
* 28/12/2021 {{didattica:ay2122:ml:esame_ml_04_02_2019_protetto.pdf|Exam_04_02_2021}} - {{didattica:ay2122:ml:ml_exam_5_02_2021_protetto.pdf|Exam_5_02_2021}} - {{didattica:ay2122:ml:ml_exam_23_10_2020_protetto.pdf|Exam_23_10_2020}}
----
===== RESULTS =====
* ''14/01/2022'' {{didattica:ay2122:ml:ml_exam_14_01_2021-rsults.pdf|Results}}
* ''07/02/2022'' {{didattica:ay2122:ml:ml_exam_07_02_2022-results_protetto.pdf|Results}}
* ''27/06/2022'' {{didattica:ay2122:ml:ml_exam_27_06_2022_results.pdf|Results}}
----