
Course unit
MACHINE LEARNING (class group 1)
INP6075419, A.A. 2017/18
Information concerning the students who enrolled in A.Y. 2017/18
ECTS: details
Type | Scientific-Disciplinary Sector | Credits allocated
Educational activities in elective or integrative disciplines | ING-INF/04 (Automatics) | 3.0
Core courses | ING-INF/05 (Data Processing Systems) | 3.0
Course unit organization
Period: First semester
Year: 1st Year
Teaching method: frontal

Type of hours | Credits | Teaching hours | Hours of individual study | Shifts
Lecture | 6.0 | 48 | 102.0 | No shift
Examination board

Board 3, A.Y. 2018/2019 (from 01/10/2018 to 15/03/2020):
CHIUSO ALESSANDRO (Chair)
VANDIN FABIO (Full Member)
ZANUTTIGH PIETRO (Substitute)

Board 2, A.Y. 2017/2018 (from 01/10/2017 to 15/03/2019):
CHIUSO ALESSANDRO (Chair)
VANDIN FABIO (Full Member)
PILLONETTO GIANLUIGI (Substitute)
ZORZI MATTIA (Substitute)

Board 1, A.Y. 2016/2017 (from 01/10/2016 to 15/03/2018):
CHIUSO ALESSANDRO (Chair)
VANDIN FABIO (Full Member)
PILLONETTO GIANLUIGI (Substitute)
ZORZI MATTIA (Substitute)

Prerequisites:

Basic knowledge of probability theory, statistics, and linear algebra.
Target skills and knowledge:

The aim of this course is to provide the fundamentals and basic principles of the learning
problem and to introduce the most common algorithms for regression and classification. Both supervised and unsupervised learning will be covered, with possibly a brief outlook on more advanced and modern topics such as sparsity and
boosting. The course will be complemented by hands-on experience through
computer simulations.
Examination methods:

Written test and take-home computer simulations.
Assessment criteria:

Knowledge of the basic tools for prediction and classification; analytical and practical
ability in the use of these tools for the solution of basic problems.
Course unit contents:

Motivation; components of the learning problem and applications of Machine Learning.
Supervised and unsupervised learning.
PART I: Supervised Learning
Introduction: Data – Classes of models – Losses
Probabilistic models and assumptions on the data
Models, Losses and the regression function. Regression and Classification
When is a model good? Model complexity, bias-variance tradeoff and generalization (VC dimension – generalization error)
Least Squares, Maximum Likelihood and Posteriors.
Models for Regression
Linear Regression (scalar and multivariate) – Stein’s paradox and Regularization – Subset Selection
Linear-in-the-parameters models, Regularization.
Local and global models (Smoothing Kernels and NNR)
Dimensionality Reduction: Principal Component Regression, Partial Least Squares.
Classes of nonlinear models: Sigmoids, Neural Networks
Kernel Methods: SVM
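As a small illustration of the least-squares regression topics in this block (a sketch added for orientation, not part of the official course material; the function name and toy data are assumptions), scalar linear regression can be fit in closed form:

```python
# Minimal scalar least-squares linear regression in plain Python:
# fit y ≈ a*x + b by minimizing the sum of squared residuals.

def fit_least_squares(xs, ys):
    """Return slope a and intercept b of the least-squares line."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Closed form: slope = covariance(x, y) / variance(x)
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Example: points exactly on the line y = 2x + 1
a, b = fit_least_squares([0, 1, 2, 3], [1, 3, 5, 7])
print(a, b)  # slope ≈ 2.0, intercept ≈ 1.0
```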
Models for Classification
Linear Discriminant Analysis, Logistic Regression, NN, Perceptron, Naïve Bayes Classifier, SVM
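To make one of the classification topics above concrete, here is a minimal sketch of the perceptron learning rule for linearly separable data with labels in {-1, +1} (illustrative only; the toy data and names are my own, not course code):

```python
# Perceptron learning rule: on each misclassified sample, nudge the
# separating hyperplane toward the sample. Converges if data are
# linearly separable.

def train_perceptron(samples, labels, epochs=20):
    """samples: list of feature lists; labels in {-1, +1}."""
    dim = len(samples[0])
    w = [0.0] * dim
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * activation <= 0:  # misclassified (or on the boundary)
                w = [wi + y * xi for wi, xi in zip(w, x)]
                b += y
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

# Toy AND-like dataset, separable by a line
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [-1, -1, -1, 1]
w, b = train_perceptron(X, y)
print([predict(w, b, x) for x in X])  # matches y
```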
Validation and Model Selection
Generalization Error, Bias-Variance Tradeoff, Cross-Validation, SURE.
Model complexity determination
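The cross-validation idea above can be sketched as a simple K-fold split: partition the indices, hold out one fold at a time, fit on the rest, and average the held-out errors. The constant-mean "model" below is a stand-in chosen only for illustration:

```python
# K-fold cross-validation sketch in plain Python.

def kfold_indices(n, k):
    """Partition range(n) into k contiguous folds of near-equal size."""
    folds, start = [], 0
    for i in range(k):
        size = n // k + (1 if i < n % k else 0)
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def cv_score(ys, k=3):
    """Cross-validated squared error of the constant (mean) predictor."""
    folds = kfold_indices(len(ys), k)
    errors = []
    for fold in folds:
        train = [ys[i] for i in range(len(ys)) if i not in fold]
        mean = sum(train) / len(train)  # "fit" on the training folds
        errors.extend((ys[i] - mean) ** 2 for i in fold)  # held-out error
    return sum(errors) / len(ys)
```

In model selection, a score like this would be computed for each candidate model complexity, and the complexity minimizing the cross-validated error would be chosen.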
PART II: Unsupervised learning
Cluster analysis: K-means clustering, Mixtures of Gaussians and EM estimation
Dimensionality reduction: Factor analysis, Principal Component Analysis (PCA) 
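As an illustration of the clustering material, a minimal one-dimensional K-means sketch (toy data and names are assumptions, not course code): alternate between assigning points to the nearest centroid and recomputing each centroid as its cluster mean.

```python
# Minimal 1-D K-means in plain Python.

def kmeans_1d(points, centroids, iters=10):
    for _ in range(iters):
        # Assignment step: group each point with its nearest centroid
        clusters = [[] for _ in centroids]
        for p in points:
            j = min(range(len(centroids)), key=lambda c: abs(p - centroids[c]))
            clusters[j].append(p)
        # Update step: centroid = mean of its cluster (kept if empty)
        centroids = [sum(c) / len(c) if c else centroids[j]
                     for j, c in enumerate(clusters)]
    return centroids

data = [1.0, 1.2, 0.8, 10.0, 10.5, 9.5]
print(sorted(kmeans_1d(data, [0.0, 5.0])))  # two centroids, near 1 and 10
```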
Planned learning activities and teaching methods:

Theoretical classes and problem-solving sessions; computer simulations in the lab.
Additional notes about suggested reading:

The course will be based on the four textbooks
"Pattern Recognition and Machine Learning",
"The Elements of Statistical Learning",
"Understanding Machine Learning: From Theory to Algorithms",
and
"Machine Learning: A Probabilistic Perspective"
(see the "Textbooks" section below).
Textbooks (and optional supplementary readings) 

T. Hastie, R. Tibshirani, J. Friedman, The Elements of Statistical Learning. Springer, 2008.

C.M. Bishop, Pattern Recognition and Machine Learning. Springer, 2006.

S. Shalev-Shwartz and S. Ben-David, Understanding Machine Learning: From Theory to Algorithms. Cambridge University Press, 2014.

K.P. Murphy, Machine Learning: A Probabilistic Perspective. MIT Press, 2012.


