
Course unit
MACHINE LEARNING (Channel 2)
INP6075419, A.Y. 2017/18
Information concerning the students who enrolled in A.Y. 2017/18
ECTS: details
  Type                                                            Scientific-Disciplinary Sector         Credits
  Educational activities in elective or integrative disciplines   ING-INF/04 (Automatics)                3.0
  Core courses                                                    ING-INF/05 (Data Processing Systems)   3.0
Mode of delivery (when and how)
  Period:           First semester
  Year:             1st year
  Teaching method:  Frontal lectures
Organisation of didactics
  Type of hours   Credits   Teaching hours   Individual study hours   Shifts
  Lecture         6.0       48               102.0                    No shifts
  Start of activities: 02/10/2017
  End of activities:   19/01/2018
Prerequisites:

Basic knowledge of probability theory, statistics, and linear algebra.
Target skills and knowledge:

The aim of this course is to provide the fundamentals and basic principles of the learning problem and to introduce the most common algorithms for regression and classification. Both supervised and unsupervised learning will be covered, possibly with a brief outlook on more advanced and modern topics such as sparsity and boosting. The course will be complemented by hands-on experience through computer simulations.
Examination methods:

Written test and take-home computer simulations.
Assessment criteria:

Knowledge of the basic tools for prediction (regression and classification); analytical and practical ability to use these tools to solve basic problems.
Course unit contents:

Motivation; components of the learning problem and applications of Machine Learning. Supervised and unsupervised learning.
PART I: Supervised Learning
Introduction: Data – Classes of models – Losses
Probabilistic models and assumptions on the data
Models, Losses and the regression function. Regression and Classification
When is a model good? Model complexity, bias-variance trade-off and generalization (VC dimension, generalization error)
Least Squares, Maximum Likelihood and Posteriors.
Models for Regression
Linear Regression (scalar and multivariate) – Stein’s paradox and Regularization – Subset Selection
Linear-in-the-parameters models, Regularization.
Local and global models (Smoothing Kernels and NNR)
Dimensionality Reduction: Principal Component Regression, Partial Least Squares.
Classes of nonlinear models: Sigmoids, Neural Networks
Kernel Methods: SVM
Models for Classification
Linear Discriminant Analysis, Logistic Regression, NN, Perceptron, Naïve Bayes Classifier, SVM
Validation and Model Selection
Generalization Error, Bias-Variance Trade-off, Cross-Validation, SURE. Model complexity determination
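As a purely illustrative sketch (not part of the official syllabus), the regularized least-squares estimator covered in PART I has the closed form w = (XᵀX + λI)⁻¹ Xᵀy, which takes only a few lines of Python with NumPy; the function name and the toy data below are invented for the example:

```python
import numpy as np

def ridge_regression(X, y, lam=1.0):
    """Regularized least squares: argmin_w ||Xw - y||^2 + lam * ||w||^2.

    Closed-form solution: w = (X^T X + lam * I)^{-1} X^T y.
    """
    n_features = X.shape[1]
    A = X.T @ X + lam * np.eye(n_features)
    return np.linalg.solve(A, X.T @ y)

# Toy example: noisy linear data with true slope 2
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 1))
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=50)
w = ridge_regression(X, y, lam=0.1)  # estimate close to 2, slightly shrunk by lam
```

The regularization weight lam trades bias for variance: lam = 0 recovers ordinary least squares, while larger values shrink the coefficients toward zero.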
PART II: Unsupervised learning
Cluster analysis: K-means Clustering, Mixtures of Gaussians and EM estimation
Dimensionality reduction: Principal Component Analysis (PCA) 
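Both PART II topics admit compact NumPy sketches; the following is an illustrative example only (function names and toy data invented), not course material:

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Lloyd's algorithm: alternate nearest-centroid assignment and centroid update."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Assign each point to its nearest centroid (Euclidean distance)
        labels = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2).argmin(axis=1)
        # Recompute each centroid as the mean of its points (keep old one if cluster is empty)
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

def pca(X, d):
    """Project X onto its top-d principal components via the SVD of the centered data."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:d].T

# Two well-separated 2-D blobs
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.5, size=(30, 2)),
               rng.normal(5.0, 0.5, size=(30, 2))])
labels, centroids = kmeans(X, k=2)
Z = pca(X, d=1)  # 1-D projection along the direction of largest variance
```

K-means minimizes within-cluster squared distances but is sensitive to initialization; PCA here uses the SVD directly rather than forming the covariance matrix, which is numerically preferable.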
Planned learning activities and teaching methods:

Theoretical classes and problem solving sessions. Computer Simulations (in the lab). 
Additional notes about suggested reading:

The course will be based on four textbooks: "Machine Learning: A Probabilistic Perspective", "Pattern Recognition and Machine Learning", "The Elements of Statistical Learning", and "Understanding Machine Learning: From Theory to Algorithms" (see Section "Testi di Riferimento", i.e. the textbook list below).
Textbooks (and optional supplementary readings) 

T. Hastie, R. Tibshirani, J. Friedman, The Elements of Statistical Learning. Springer, 2008.

C. M. Bishop, Pattern Recognition and Machine Learning. Springer, 2006.

S. Shalev-Shwartz, S. Ben-David, Understanding Machine Learning: From Theory to Algorithms. Cambridge University Press, 2014.

K. P. Murphy, Machine Learning: A Probabilistic Perspective. MIT Press, 2012.


