Resource Efficient Machine Learning
This project aims to develop efficient machine learning solutions by blending statistical and algorithmic principles to design scalable learning machines.
The Laboratory for Computational and Statistical Learning (LCSL) is one of the research units within MaLGa, the machine learning center of the University of Genova. LCSL focuses on the development of efficient and reliable machine learning algorithms blending tools from statistics, optimization, and regularization theory.
Our goal is to develop theoretically grounded and practical machine learning algorithms. In collaboration with other MaLGa units and external collaborators, we work on foundational aspects of Machine Learning (CHARML) as well as on a number of applications, including vision (MLV), robotics, and biological behavior (PiMLB).
We aim to develop flexible and scalable optimization algorithms for machine learning, emphasizing robustness and adaptivity to tackle diverse and complex data-driven problems effectively.
We tackle the challenge of handling structured data in machine learning, addressing various domains and tasks such as text structure inference, subgraph identification, and time series forecasting.
Annalisa Barla
Simone Di Marino
Cesare Molinari
Lorenzo Rosasco
Silvia Villa
Marco Rando
Edoardo Caldarelli
Emilia Magnani
Arnaud Watusadisi
Ilaria Stanzani
Marco Letizia
Hippolyte Labarrière
Shuo Huang
Emanuele Naldi
Pietro Zerbetto
Joachim Bona-Pellis…
Lorenzo Fiorito
Name | Years | Role | Area |
---|---|---|---|
Cheik Traoré | 2023 → 2024 | Post-doctoral fellow | Optimization |
Stefano Ravera | 2022 → 2024 | Research Scholar | Data handling for large-scale AI applications |
Francesco Montagna | 2021 → 2024 | PhD student | |
Rayan Autones | 2024 | Student | |
Andrea Della Vecchia | 2022 → 2024 | Post-doctoral fellow | Machine Learning |
Elena Milano | 2024 | Research Scholar | Content strategy for ethical-AI communication |
Ettore Fincato | 2023 → 2024 | PhD student | |
Cristian Jesus Vega Cereno | 2021 → 2024 | PhD student | Optimization |
Gabriele Bortolai | 2024 | Student | |
Jonathan Chirinos Rodriguez | 2024 | PhD student | |
Antoine Chatalic | 2021 → 2024 | Post-doctoral fellow | |
Rosanna Turrisi | 2023 → 2024 | Post-doctoral fellow | Machine Learning |
Elisa Maiettini | 2021 → 2023 | Post-doctoral fellow | |
Nicolas Schreuder | 2020 → 2023 | Post-doctoral fellow | |
Vassilis Apidopoulos | 2019 → 2023 | Post-doctoral fellow | |
Ilaria Stanzani | 2023 | Research Scholar | Machine Learning |
Federico Ceola | 2020 → 2023 | PhD student | |
Andrea Maracani | 2020 → 2023 | PhD student | |
Cheik Traoré | 2020 → 2023 | PhD student | Optimization |
Paolo Didier Alfano | 2022 → 2023 | Research Scholar | Machine Learning, Robot Vision |
Title | Year | Author | Venue |
---|---|---|---|
Iterative regularization for low complexity regularizers | 2024 | Molinari C.; Massias M.; Rosasco L.; Villa S. | NUMERISCHE MATHEMATIK |
SGD vs GD: high noise and rank shrinkage | 2024 | Xu M.; Galanti T.; Rangamani A.; Rosasco L.; Pinto A.; Poggio T. | CBMM Memo No. 144 |
Neural reproducing kernel Banach spaces and representer theorems for deep networks | 2024 | Bartolucci F.; De Vito E.; Rosasco L.; Vigogna S. | ArXiv Preprint |
A New Formulation for Zeroth-Order Optimization of Adversarial EXEmples in Malware Detection | 2024 | Rando M.; Demetrio L.; Rosasco L.; Roli F. | ArXiv Preprint |
Mapping the evolution of design research: a data-driven analysis of interdisciplinary trends and intellectual landscape | 2024 | Vian A.; Carella G.; Pretolesi D.; Barla A.; Zurlo F. | in Gray, C., Ciliotta Chehade, E., Hekkert, P., Forlano, L., Ciuccarelli, P., Lloyd, P. (eds.), DRS2024: Boston, 23–28 June, Boston, USA |
Code accompanying the paper “Structured Prediction for CRiSP Inverse Kinematics Learning with Misspecified Robot Models”
Iterreg is a scikit-learn-compatible Python package to perform iterative regularization of linear models. It implements the algorithm from “Iterative regularization for convex regularizers” by C. Molinari, M. Massias, L. Rosasco, and S. Villa, AISTATS 2021.
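The core idea behind iterative regularization can be sketched in a few lines: run plain gradient descent on the least-squares objective starting from zero, and let the number of iterations play the role of the regularization parameter. The NumPy sketch below illustrates only this principle; it is not the iterreg API, and the function name and step-size choice are ours.

```python
import numpy as np

def gd_least_squares(X, y, n_iter, step=None):
    """Gradient descent on 0.5 * ||Xw - y||^2 starting from w = 0.

    With early stopping, the iteration count acts as the regularization
    parameter: few iterations = strong regularization. Illustrative
    sketch only, not the iterreg package API.
    """
    n, d = X.shape
    if step is None:
        # 1/L with L = largest eigenvalue of X^T X (Lipschitz constant)
        step = 1.0 / np.linalg.norm(X, 2) ** 2
    w = np.zeros(d)
    for _ in range(n_iter):
        w -= step * X.T @ (X @ w - y)
    return w

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
w_true = rng.standard_normal(10)
y = X @ w_true + 0.1 * rng.standard_normal(50)

w_few = gd_least_squares(X, y, n_iter=5)      # heavily regularized iterate
w_many = gd_least_squares(X, y, n_iter=5000)  # close to the least-squares fit
```

Starting from zero, the iterates grow monotonically toward the least-squares solution, so stopping early returns a shrunken, better-conditioned estimate.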
Python package implementing the Falkon algorithm for large-scale, approximate kernel ridge regression. The implementation is based on PyTorch and runs on CPU and GPU.
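Falkon builds on a Nyström approximation: the kernel expansion is restricted to a small set of random centers, so only a small linear system has to be solved. As a rough illustration of that idea, here is a minimal NumPy sketch; it is not the Falkon API (Falkon uses PyTorch and a preconditioned conjugate-gradient solver, while this sketch solves the small system directly), and all function names are ours.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def nystrom_krr_fit(X, y, m, lam, sigma=1.0, seed=0):
    """Nystrom-approximate kernel ridge regression (illustrative sketch).

    Restricts the solution to m randomly chosen centers, the core idea
    behind large-scale solvers such as Falkon; this is NOT the Falkon
    package API.
    """
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=m, replace=False)]
    Knm = gaussian_kernel(X, centers, sigma)        # n x m
    Kmm = gaussian_kernel(centers, centers, sigma)  # m x m
    # Solve (Knm^T Knm + n*lam*Kmm) alpha = Knm^T y;
    # the small jitter term keeps the system numerically stable.
    A = Knm.T @ Knm + len(X) * lam * Kmm + 1e-8 * np.eye(m)
    alpha = np.linalg.solve(A, Knm.T @ y)
    return centers, alpha

def nystrom_krr_predict(Xtest, centers, alpha, sigma=1.0):
    return gaussian_kernel(Xtest, centers, sigma) @ alpha

# Toy 1-D regression: fit sin(x) with 20 of 200 points as centers.
Xtr = np.linspace(-3, 3, 200)[:, None]
ytr = np.sin(Xtr[:, 0])
centers, alpha = nystrom_krr_fit(Xtr, ytr, m=20, lam=1e-4)
pred = nystrom_krr_predict(Xtr, centers, alpha)
```

The m × m system costs O(nm² + m³) instead of the O(n³) of exact kernel ridge regression, which is what makes the approach scale.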
Python code implementing Batch-BKB: the first Bayesian optimization (a.k.a. Gaussian process or bandit optimization) algorithm that is both provably no-regret and guaranteed to run in near-linear time.
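To give a flavor of the Gaussian-process bandit loop that Batch-BKB accelerates, here is a naive GP-UCB sketch in NumPy. It computes the exact O(n³) posterior rather than the sketched one BKB maintains, and everything in it (function names, the β exploration weight, the toy objective) is illustrative only, not the Batch-BKB code.

```python
import numpy as np

def rbf(a, b, ell=0.5):
    # 1-D Gaussian kernel between point sets a and b.
    return np.exp(-((a[:, None] - b[None, :]) ** 2) / (2 * ell ** 2))

def gp_ucb_step(X_obs, y_obs, candidates, beta=2.0, noise=1e-3):
    """One round of GP-UCB on a 1-D candidate grid (illustrative only).

    Exact Gaussian-process posterior, then pick the candidate maximizing
    mean + beta * std. Batch-BKB replaces this cubic-cost posterior with
    a sketched one to reach near-linear time.
    """
    K = rbf(X_obs, X_obs) + noise * np.eye(len(X_obs))
    k_star = rbf(candidates, X_obs)          # m x n cross-covariances
    mu = k_star @ np.linalg.solve(K, y_obs)  # posterior mean
    # Posterior variance: k(x, x) - k_*^T K^{-1} k_*  (k(x, x) = 1 here).
    v = np.linalg.solve(K, k_star.T)
    var = 1.0 - np.einsum('mn,nm->m', k_star, v)
    ucb = mu + beta * np.sqrt(np.maximum(var, 0))
    return candidates[np.argmax(ucb)], mu, var

# Toy problem: maximize f(x) = -(x - 0.3)^2 on [0, 1].
f = lambda x: -(x - 0.3) ** 2
grid = np.linspace(0, 1, 101)
X_obs = np.array([0.0, 1.0])  # initial design at the endpoints
y_obs = f(X_obs)
for _ in range(10):
    x_next, _, _ = gp_ucb_step(X_obs, y_obs, grid)
    X_obs = np.append(X_obs, x_next)
    y_obs = np.append(y_obs, f(x_next))
best = X_obs[np.argmax(y_obs)]
```

Each round refits the posterior on all observations, which is exactly the cost that sketching-based methods like Batch-BKB are designed to avoid.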