Resource Efficient Machine Learning
This project aims to develop efficient machine learning solutions by blending statistical and algorithmic principles to design scalable learning machines.
The Laboratory for Computational and Statistical Learning (LCSL) is one of the research units within MaLGa, the machine learning center of the University of Genova. LCSL focuses on the development of efficient and reliable machine learning algorithms blending tools from statistics, optimization, and regularization theory.
Our goal is to develop theoretically grounded and practical machine learning algorithms. In collaboration with other MaLGa units and external collaborators, we work on foundational aspects of Machine Learning (CHARML) as well as on a number of applications, including vision (MLV), robotics, and biological behavior (PiMLB).
We aim to develop flexible and scalable optimization algorithms for machine learning, emphasizing robustness and adaptivity to tackle diverse and complex data-driven problems effectively.
We tackle the challenge of handling structured data in machine learning, addressing various domains and tasks such as text structure inference, subgraph identification, and time series forecasting.
Annalisa Barla
Simone Di Marino
Cesare Molinari
Lorenzo Rosasco
Silvia Villa
Edoardo Caldarelli
Arnaud Watusadisi
Ilaria Stanzani
Marco Letizia
Hippolyte Labarrière
Shuo Huang
Pietro Zerbetto
Joachim Bona-Pellis…
Lorenzo Fiorito
Oleksii Kachaiev
Giordano Vitale
Luc Brogat-Motte
Luca Piccaluga
| Name | Years | Role | Topic |
|---|---|---|---|
| Marco Rando | 2023 → 2025 | Post-doctoral fellow | |
| Emanuele Naldi | 2023 → 2025 | Post-doctoral fellow | Optimization in infinite dimensional spaces |
| Francesco Montagna | 2021 → 2025 | PhD student | |
| Emilia Magnani | 2024 | PhD student | |
| Cheik Traoré | 2023 → 2024 | Post-doctoral fellow | Optimization |
| Stefano Ravera | 2022 → 2024 | Research Scholar | Data handling for large-scale AI applications |
| Rayan Autones | 2024 | Student | |
| Andrea Della Vecchia | 2022 → 2024 | Post-doctoral fellow | Machine Learning |
| Elena Milano | 2024 | Research Scholar | Content strategy for ethical-AI communication |
| Ettore Fincato | 2023 → 2024 | PhD student | |
| Cristian Jesus Vega Cereno | 2021 → 2024 | PhD student | Optimization |
| Gabriele Bortolai | 2024 | Student | Machine Learning and Vision |
| Jonathan Chirinos Rodriguez | 2024 | PhD student | |
| Antoine Chatalic | 2021 → 2024 | Post-doctoral fellow | |
| Rosanna Turrisi | 2023 → 2024 | Post-doctoral fellow | Machine Learning |
| Elisa Maiettini | 2021 → 2023 | Post-doctoral fellow | |
| Nicolas Schreuder | 2020 → 2023 | Post-doctoral fellow | |
| Vassilis Apidopoulos | 2019 → 2023 | Post-doctoral fellow | |
| Federico Ceola | 2020 → 2023 | PhD student | |
| Andrea Maracani | 2020 → 2023 | PhD student | |
| Title | Year | Author | Venue | 
|---|---|---|---|
| Artificial intelligence and network science as tools to illustrate academic research evolution in interdisciplinary fields: The case of Italian design | 2025 | Pretolesi Daniele; Stanzani Ilaria; Ravera Stefano; Vian Andrea; Barla Annalisa | PLOS ONE | 
| Linear quadratic control of nonlinear systems with Koopman operator learning and the Nyström method | 2025 | Caldarelli E.; Chatalic A.; Colome A.; Molinari C.; Ocampo-Martinez C.; Torras C.; Rosasco L. | AUTOMATICA | 
| Grand-Canonical Optimal Transport | 2025 | DI MARINO Simone; Lewin Mathieu; Nenna Luca | ARCHIVE FOR RATIONAL MECHANICS AND ANALYSIS | 
| Representation theorems for normed modules | 2025 | Di Marino S.; Lucic D.; Pasqualetto E. | REVISTA DE LA REAL ACADEMIA DE CIENCIAS EXACTAS, FÍSICAS Y NATURALES. SERIE A, MATEMÁTICAS | 
| Be greedy and learn: Efficient and certified algorithms for parametrized optimal control problems | 2025 | Molinari Cesare; Kleikamp H.; Lazar Martin | ESAIM. MATHEMATICAL MODELLING AND NUMERICAL ANALYSIS | 
Code accompanying the paper “Structured Prediction for CRiSP Inverse Kinematics Learning with Misspecified Robot Models”
Iterreg is a scikit-learn-compatible Python package for iterative regularization of linear models. It implements the algorithm of “Iterative regularization for convex regularizers” by C. Molinari, M. Massias, L. Rosasco and S. Villa, AISTATS 2021.
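To give an idea of what iterative regularization means, the sketch below runs plain gradient descent on a least-squares problem and uses the iteration count as the regularization parameter, stopping early based on held-out error. This is only a conceptual illustration; it is not the iterreg API nor the exact algorithm from the AISTATS 2021 paper.

```python
# Conceptual sketch of iterative regularization (NOT the iterreg API):
# gradient descent on least squares, where early stopping plays the
# role of explicit regularization.
import numpy as np

rng = np.random.default_rng(0)
n, d = 100, 50
X = rng.standard_normal((n, d))
w_true = np.zeros(d)
w_true[:5] = 1.0
y = X @ w_true + 0.1 * rng.standard_normal(n)

# Train/validation split used only to pick the stopping time.
X_tr, X_val, y_tr, y_val = X[:70], X[70:], y[:70], y[70:]

w = np.zeros(d)
step = 1.0 / np.linalg.norm(X_tr, 2) ** 2      # 1/L, L = spectral norm squared
best_w, best_err = w.copy(), np.inf
for t in range(1000):
    grad = X_tr.T @ (X_tr @ w - y_tr)          # gradient of 0.5 * ||Xw - y||^2
    w -= step * grad
    val_err = np.mean((X_val @ w - y_val) ** 2)
    if val_err < best_err:                     # keep the early-stopped iterate
        best_w, best_err = w.copy(), val_err

print("validation MSE at early-stopped iterate:", best_err)
```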
Python package implementing the Falkon algorithm for large-scale approximate kernel ridge regression. The implementation is based on PyTorch and runs on both CPU and GPU.
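The core idea behind Falkon is Nyström approximation: restrict the kernel ridge regression solution to the span of a small set of landmark points, so that an M × M system is solved instead of an n × n one. The snippet below is a minimal NumPy sketch of that idea only; it is not the falkon package API and omits the preconditioned iterative solver that makes the actual library scale.

```python
# NumPy sketch of Nystrom-style approximate kernel ridge regression,
# the idea underlying Falkon (NOT the falkon package API).
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # Pairwise Gaussian kernel matrix between rows of A and rows of B.
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2 * sigma ** 2))

rng = np.random.default_rng(0)
n, d, M, lam = 2000, 5, 100, 1e-3
X = rng.standard_normal((n, d))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)

centers = X[rng.choice(n, M, replace=False)]   # Nystrom landmarks
K_nm = gaussian_kernel(X, centers)             # n x M cross-kernel
K_mm = gaussian_kernel(centers, centers)       # M x M landmark kernel

# Solve the M x M regularized system instead of the full n x n one.
A = K_nm.T @ K_nm + n * lam * K_mm
alpha = np.linalg.solve(A + 1e-8 * np.eye(M), K_nm.T @ y)

X_test = rng.standard_normal((10, d))
y_pred = gaussian_kernel(X_test, centers) @ alpha
print(y_pred.shape)                            # (10,)
```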
Python code implementing Batch-BKB: the first Bayesian optimization (a.k.a. Gaussian process or bandit optimization) algorithm that is both provably no-regret and guaranteed to run in near-linear time.
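For readers unfamiliar with the setting, the toy loop below shows a generic GP-UCB-style Bayesian optimization procedure on a 1-D function: fit a Gaussian process posterior to the points queried so far, then query where the optimistic upper confidence bound is largest. This is only an illustration of the problem Batch-BKB addresses; it is not the Batch-BKB algorithm, which additionally sparsifies the GP to achieve near-linear running time and batches its queries.

```python
# Generic GP-UCB-style Bayesian optimization on a 1-D toy problem
# (illustration only; NOT the Batch-BKB algorithm).
import numpy as np

def f(x):
    return -np.sin(3 * x) - x ** 2 + 0.7 * x   # unknown objective to maximize

def rbf(a, b, ls=0.3):
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * ls ** 2))

rng = np.random.default_rng(0)
grid = np.linspace(-1.0, 2.0, 200)             # candidate arms
X = rng.uniform(-1, 2, size=2)                 # initial queries
y = f(X) + 0.01 * rng.standard_normal(2)
noise, beta = 1e-2, 2.0

for t in range(15):
    K = rbf(X, X) + noise * np.eye(len(X))
    K_inv = np.linalg.inv(K)
    k_star = rbf(grid, X)                      # cross-covariances, |grid| x |X|
    mu = k_star @ K_inv @ y                    # GP posterior mean on the grid
    var = 1.0 - np.einsum('ij,jk,ik->i', k_star, K_inv, k_star)
    ucb = mu + beta * np.sqrt(np.clip(var, 0, None))
    x_next = grid[np.argmax(ucb)]              # optimistic acquisition
    X = np.append(X, x_next)
    y = np.append(y, f(x_next) + 0.01 * rng.standard_normal())

print("best point found:", X[np.argmax(y)])
```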