Regularization Methods for Machine Learning 2021
At a glance
UniGe | MaLGa & DIBRIS - firstname.lastname@example.org
From Jun 21 2021 to Jun 25 2021
Live streaming on Microsoft Teams
Application deadline: May 16
Notifications of acceptance have been sent out
The school will be held online on Microsoft Teams
We sent the invitations to join RegML on Microsoft Teams
Certificates of attendance have been sent out
Understanding how intelligence works and how it can be emulated by machines is an age-old dream and arguably one of the biggest challenges in modern science. Learning, with its principles and computational implementations, is at the very core of this endeavor.
Recently, for the first time, we have been able to develop artificial intelligence systems able to solve complex tasks considered out of reach for decades.
Modern cameras recognize faces, smartphones respond to voice commands, cars can see and detect pedestrians, and ATMs automatically read checks.
In most cases, at the root of these success stories are machine learning algorithms, that is, software that is trained rather than programmed to solve a task.
Among the variety of approaches to modern computational learning, we focus on regularization techniques, which are key to high-dimensional learning.
Regularization methods make it possible to treat a huge class of diverse approaches in a unified way, while providing tools to design new ones. Starting from classical notions of smoothness, shrinkage and margin, the course will cover state-of-the-art techniques based on the concepts of geometry (aka manifold learning) and sparsity, along with a variety of algorithms for supervised learning, feature selection, structured prediction, multitask learning and model selection. Practical applications to high-dimensional problems, in particular in computational vision, will be discussed.
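As a taste of the classical shrinkage idea mentioned above, here is a minimal NumPy sketch of Tikhonov (ridge) regularized least squares, solved in closed form. The data, function name, and regularization parameter are illustrative assumptions, not course material.

```python
import numpy as np

def tikhonov_regression(X, y, lam):
    """Tikhonov-regularized least squares:
    w = argmin_w ||Xw - y||^2 / n + lam * ||w||^2,
    with closed-form solution w = (X^T X / n + lam I)^{-1} (X^T y / n)."""
    n, d = X.shape
    A = X.T @ X / n + lam * np.eye(d)
    b = X.T @ y / n
    return np.linalg.solve(A, b)

# Illustrative toy data: y depends on the first feature only, plus noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
w_true = np.array([2.0, 0.0, 0.0, 0.0, 0.0])
y = X @ w_true + 0.1 * rng.normal(size=100)

w_hat = tikhonov_regression(X, y, lam=0.1)
```

The penalty `lam * ||w||^2` shrinks the estimate toward zero, trading a little bias for stability; choosing `lam` is exactly the model-selection problem addressed in Lab 1.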
The classes will focus on algorithmic and methodological aspects, while giving an idea of the underlying theory. Practical laboratory sessions will offer the opportunity to gain hands-on experience.
A certificate of attendance (2 credits suggested according to the ECTS grading scale) will be sent to all participants.
An exam certificate (no grade, 6 credits suggested according to the ECTS grading scale) will be issued to those who take and pass the exam.
RegML is a 20-hour advanced machine learning course including theory classes and practical laboratory sessions. The course covers foundations as well as recent advances in machine learning, with emphasis on high-dimensional data and a core set of techniques, namely regularization methods. In many respects the course is a compressed version of the 9.520 course at MIT.
Note: all times are in CEST / GMT+2.
1. Mon - 9.30-11.00 - Class 1: Introduction to Statistical Machine Learning
2. Mon - 11.00-13.00 - Class 2: Tikhonov Regularization and Kernels
3. Mon - 14.00-16.00 - Lab 1: Binary classification and model selection
4. Tue - 9.30-11.00 - Class 3: Early Stopping and Spectral Regularization
5. Tue - 11.00-13.00 - Class 4: Regularization for Multi-task Learning
6. Tue - 14.00-16.00 - Lab 2: Spectral filters and multi-class classification
7. Wed - 9.30 - Workshop: Carlo Ciliberto - A (Biased) Introduction to Meta-Learning
8. Thu - 9.30-11.00 - Class 5: Sparsity Based Regularization
9. Thu - 11.00-13.00 - Class 6: Structured Sparsity
10. Thu - 14.00-16.00 - Lab 3: Sparsity-based learning
11. Fri - 9.30-11.00 - Data Representation: Dictionary Learning
12. Fri - 11.00-13.00 - Data Representation: Deep Learning
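As a taste of the sparsity-based regularization covered in Class 5 and Lab 3, here is a minimal sketch of the lasso solved by iterative soft-thresholding (ISTA). The data, function name, and parameter values are illustrative assumptions, not the course's own lab code.

```python
import numpy as np

def ista(X, y, lam, n_iter=500):
    """Iterative soft-thresholding (ISTA) for the lasso:
    min_w ||Xw - y||^2 / (2n) + lam * ||w||_1."""
    n, d = X.shape
    step = n / (np.linalg.norm(X, 2) ** 2)  # 1 / Lipschitz constant of the gradient
    w = np.zeros(d)
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y) / n         # gradient of the smooth part
        w = w - step * grad
        # soft-thresholding: the proximal operator of the l1 norm
        w = np.sign(w) * np.maximum(np.abs(w) - step * lam, 0.0)
    return w

# Illustrative toy data with a sparse ground truth.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
w_true = np.array([1.5, 0.0, 0.0, 0.0, 0.0])
y = X @ w_true + 0.1 * rng.normal(size=100)

w_hat = ista(X, y, lam=0.1)
```

Unlike the squared-norm penalty, the l1 penalty drives irrelevant coefficients exactly to zero, which is why sparsity-based methods double as feature-selection tools.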
Associate Professor in Machine Learning
University College London
A (Biased) Introduction to Meta-Learning
Once accepted, each candidate has to follow the instructions in the acceptance email and proceed with the payment. The registration fee is non-refundable.
students and postdocs: waived
professionals: EUR 150
UniGe students and IIT affiliates: no fee
To apply, complete the application form
UniGe | MaLGa & DIBRIS - email@example.com
UniGe | MaLGa & DIBRIS - firstname.lastname@example.org
MLCC 2020. A one week (crash) course of 10 lectures, including theoretical and practical sessions
MIT 9.520 Statistical Learning Theory and Applications. A term-long course of roughly 25 lectures offered to graduate students at MIT
Machine Learning 2018/2019. Undergraduate term-long introductory Machine Learning course offered at the University of Genova
CBMM Summer School: Machine Learning Classes. One day introduction to the essential concepts and algorithms at the core of modern Machine Learning
RegML master page. Previous editions of RegML. The course, started in 2008, has seen increasing national and international attendance over the years, with a peak of over 90 participants in 2014