School of Engineering
ICT FOR INTERNET AND MULTIMEDIA
Course unit
NEURAL NETWORKS AND DEEP LEARNING
INP9086459, A.Y. 2019/20

Information concerning the students who enrolled in A.Y. 2019/20

Information on the course unit
Degree course Second cycle degree in ICT FOR INTERNET AND MULTIMEDIA (Ord. 2019), IN2371, Degree course structure A.Y. 2019/20, N0
Degree course track ICT FOR LIFE AND HEALTH [004PD]
Number of ECTS credits allocated 6.0
Type of assessment Mark
Course unit English denomination NEURAL NETWORKS AND DEEP LEARNING
Department of reference Department of Information Engineering
E-Learning website https://elearning.dei.unipd.it/course/view.php?idnumber=2019-IN2371-004PD-2019-INP9086459-N0
Mandatory attendance No
Language of instruction English
Branch PADOVA
Single Course unit The course unit can be attended under the Single Course unit attendance option
Optional Course unit The course unit can be chosen as an optional course unit

Lecturers
Teacher in charge ALBERTO TESTOLIN ING-INF/03

Cross-listed (mutuated) course units
Course unit code Course unit name Teacher in charge Degree course code
SCP8082718 COMPUTATIONAL NEUROSCIENCE ALBERTO TESTOLIN SC2443
INP9086459 NEURAL NETWORKS AND DEEP LEARNING ALBERTO TESTOLIN IN2371

ECTS: details
Type Scientific-Disciplinary Sector Credits allocated
Core courses ING-INF/03 Telecommunications 6.0

Course unit organization
Period First semester
Year 1st Year
Teaching method frontal

Type of hours Credits Teaching hours Hours of individual study Shifts
Lecture 6.0 48 102.0 No turn

Calendar
Start of activities 30/09/2019
End of activities 18/01/2020
Course timetable 2019/20 Reg.2019

Examination board
Examination board not defined

Syllabus
Prerequisites: The course relies on preliminary knowledge of mathematical analysis, linear algebra and probability theory. Familiarity with machine learning concepts is desired, though not mandatory. Python programming skills are required.
Target skills and knowledge: The course covers the theory and practice of artificial neural networks, highlighting their relevance both for artificial intelligence applications and for modeling human cognition and brain function. The theoretical discussion of the various types of neural networks and learning algorithms is complemented by hands-on practice sessions in the computer lab, based on the PyTorch framework.
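As a rough illustration of the kind of lab exercise this implies, the following is a minimal PyTorch sketch of a small feedforward network trained with backpropagation on random toy data; the architecture, data, and hyperparameters are illustrative assumptions, not actual course material.

# Minimal training-loop sketch (illustrative only: the actual lab exercises,
# architectures, and data sets are defined during the course).
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy data: 100 random 10-dimensional inputs with binary class labels (assumed).
x = torch.randn(100, 10)
y = torch.randint(0, 2, (100,))

# Small multi-layer perceptron: 10 -> 32 -> 2.
model = nn.Sequential(
    nn.Linear(10, 32),
    nn.ReLU(),
    nn.Linear(32, 2),
)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(20):
    optimizer.zero_grad()
    logits = model(x)            # forward pass
    loss = criterion(logits, y)  # cross-entropy loss on the toy labels
    loss.backward()              # backpropagation of gradients
    optimizer.step()             # gradient-based weight update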
Examination methods: Evaluation of the knowledge and abilities acquired will consist of an individual project assignment, which will be discussed during the oral exam. The project will require a software implementation of one or more of the computational models and analyses discussed during the course, along with a short essay in which the student describes and discusses the project results. The oral exam will also include general theoretical questions related to the course content.
Assessment criteria: The evaluation is based on the understanding of course topics and the acquisition of the proposed concepts and methodologies.
Course unit contents: 1. Introduction: computational and mathematical modeling of neural systems; basics of neuroscience; levels of analysis in system neuroscience.
2. Single-neuron modeling: morphology, neuro-electronics, principles of synaptic transmission; integrate-and-fire models; the Hodgkin-Huxley model (a minimal integrate-and-fire simulation sketch is given after this list).
3. Principles of neural encoding: recording neuronal responses; spike trains, firing rates, local field potentials; tuning functions and receptive fields; efficient encoding principles and information compression.
4. Network modeling: neural network architectures; localistic, distributed, and sparse representations; examples from the visual system.
5. Learning, memory and plasticity: synaptic plasticity in biological systems (Hebb rule, LTP, LTD, STDP); synaptic plasticity in artificial neural networks and overview of machine learning basics.
6. Supervised learning: perceptron, delta rule, error backpropagation (a minimal delta-rule sketch is given after this list).
7. Supervised deep learning: advanced optimization methods for training multi-layer networks; convolutional architectures; transfer learning and multi-task learning.
8. Recurrent neural networks: backpropagation through time, long short-term memory networks.
9. Unsupervised learning: competitive networks; self-organizing maps; associative memories and Hopfield networks; autoencoders and Boltzmann machines.
10. Unsupervised deep learning: hierarchical generative models; generative adversarial networks.
11. Reinforcement learning: exploration-exploitation dilemma; temporal-difference learning; conditioning and dopamine circuits; deep reinforcement learning.
12. Case studies from neurocognitive modeling: visual perception; space coding; semantic cognition; complementary learning systems; hippocampus and experience replay.
13. Large-scale brain organization: structural and functional properties of brain networks; neuronal oscillations and spontaneous brain activity; neuromorphic hardware.
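To make item 2 concrete, here is a minimal Python/NumPy sketch of a leaky integrate-and-fire neuron simulated with Euler integration; all parameter values and the constant input current are illustrative assumptions, not course-prescribed settings.

# Minimal leaky integrate-and-fire simulation (illustrative parameter values).
import numpy as np

dt = 0.1          # time step (ms)
tau_m = 10.0      # membrane time constant (ms)
v_rest = -70.0    # resting potential (mV)
v_reset = -75.0   # reset potential after a spike (mV)
v_thresh = -54.0  # spike threshold (mV)
r_m = 10.0        # membrane resistance (MOhm)
i_ext = 1.8       # constant external current (nA)

t = np.arange(0.0, 200.0, dt)
v = np.full_like(t, v_rest)   # membrane potential trace
spikes = []                   # recorded spike times

for k in range(1, len(t)):
    # Euler step of tau_m * dV/dt = -(V - v_rest) + R * I
    dv = (-(v[k - 1] - v_rest) + r_m * i_ext) / tau_m
    v[k] = v[k - 1] + dt * dv
    if v[k] >= v_thresh:      # threshold crossing: emit a spike and reset
        spikes.append(t[k])
        v[k] = v_reset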
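Similarly, for item 6, a minimal NumPy sketch of the delta rule applied to a single linear unit on toy regression data; the data, learning rate, and number of epochs are assumptions chosen only for illustration.

# Minimal delta-rule sketch for a single linear unit (toy data, illustrative).
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(100, 5))            # 100 samples, 5 input features
true_w = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = x @ true_w                           # targets from a linear "teacher"

w = np.zeros(5)                          # learnable weights
eta = 0.01                               # learning rate (assumed)

for epoch in range(50):
    for xi, ti in zip(x, y):
        out = w @ xi                     # linear unit output
        w += eta * (ti - out) * xi       # delta rule: w += eta * (target - output) * input

print(w)  # should approach true_w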
Planned learning activities and teaching methods: Teaching is based on frontal lectures covering the theory and on practice classes on neural network modeling in the computer lab. Interactive teaching techniques will be used, including think-pair-share and short interactive discussions of open questions. This will foster interactive learning and the ability to critically reflect on the concepts discussed.
Additional notes about suggested reading: All topics will be covered during the lectures. Slides will be made available on e-learning. Students' notes must be integrated with the reference books and with further material (mostly scientific articles) provided by the teacher on the e-learning platform.
Textbooks (and optional supplementary readings)
  • Goodfellow, I., Bengio, Y., and Courville, A., Deep Learning. MIT Press, 2016. Electronic version freely available online.
  • Dayan, P., and Abbott, L. F., Theoretical Neuroscience. MIT Press, 2001. Electronic version freely available online.
  • Hertz, J., Krogh, A., and Palmer, R. G., Introduction to the Theory of Neural Computation. Westview Press, 1991.