
Partners

The school is made possible by the SamPDE Project, funded by an ERC Starting Grant and Next Generation EU, and by contributions from the Department of Mathematics of the University of Genoa.

At a glance

The school consists of three courses on applied harmonic analysis and machine learning. Graduate students in Mathematics, Physics, Computer Science and Engineering, as well as postdoctoral fellows and young researchers, are welcome.

The school will also feature a one-day workshop with invited speakers and contributed talks/posters. Participants are encouraged to apply.

Limited funding is available for local accommodation expenses. Priority will be given to PhD students and young researchers.

The school will take place exclusively in person; it will not be streamed online.

Logistics

The school will be held in our beautiful Genova, in the DIMA building at Via Dodecaneso 35, 16146, home of the MaLGa Center.


Speakers & instructors

The classes and workshops will be conducted by the following leading experts.

Programme

Solving semidefinite programs with low-rank solutions

Instructor: Irène Waldspurger

In the first part of this class, we will describe and motivate our object of study (that is, semidefinite problems with a low-rank solution). In particular, we will see that such problems provide good, and sometimes perfect, convex approximations of difficult non-convex optimization problems.

In the second part, we will discuss how to solve these problems numerically. If we ignore the low rank of the solution, we can use generic semidefinite solvers, but these are often inconveniently slow. It is therefore desirable to exploit the low rank. We will describe the current approaches, focusing in particular on a family of methods based on the so-called Burer-Monteiro heuristic. These methods have the advantage of significantly reducing the dimension of the semidefinite problem. On the negative side, the reduced-dimension problem is no longer convex; however, we will see that in many situations this non-convexity is a benign issue.
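As a concrete illustration of the Burer-Monteiro idea (a minimal sketch, not part of the course material), consider the max-cut SDP relaxation: maximize ⟨C, X⟩ subject to diag(X) = 1 and X positive semidefinite. Factoring X = V Vᵀ with V having only a few columns turns the diagonal constraint into unit-norm rows of V, which the sketch below enforces by projection after each gradient step; the cost matrix, step size and projected-ascent update are illustrative assumptions.

```python
import numpy as np

def burer_monteiro_maxcut(C, p, n_iters=500, lr=1e-2, seed=0):
    """Illustrative Burer-Monteiro heuristic for the max-cut SDP relaxation
        max <C, X>   s.t.  diag(X) = 1,  X positive semidefinite,
    using the factorization X = V V^T with V of size n x p (p << n).
    The unit-diagonal constraint becomes ||v_i|| = 1 for every row of V,
    enforced here by renormalizing the rows after each ascent step."""
    rng = np.random.default_rng(seed)
    n = C.shape[0]
    V = rng.standard_normal((n, p))
    V /= np.linalg.norm(V, axis=1, keepdims=True)
    for _ in range(n_iters):
        grad = 2.0 * C @ V                             # gradient of tr(C V V^T) in V
        V += lr * grad                                 # ascent step on the non-convex problem
        V /= np.linalg.norm(V, axis=1, keepdims=True)  # project back onto unit-norm rows
    X = V @ V.T                                        # rank <= p, PSD, diag(X) = 1
    return X, float(np.sum(C * X))

# Tiny example with a random symmetric cost matrix
rng = np.random.default_rng(1)
A = rng.standard_normal((20, 20))
C = (A + A.T) / 2
X, value = burer_monteiro_maxcut(C, p=5)
```

The optimization now involves only n·p variables instead of a full n×n matrix, which is exactly the dimension reduction mentioned above; whether the resulting non-convex landscape is benign depends on the chosen rank p and on the problem instance.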

Infinite graphs, neural networks and applications to imaging

Instructors: Alberto Setti, Davide Bianchi

Inverse problems in imaging arise in various tasks, including denoising, deblurring, inpainting, super-resolution, and computed tomography, to cite just a few of the most famous and well-studied examples. These problems are typically ill-posed, meaning that even minor (and inevitable) perturbations in the observed data can result in a highly inaccurate approximation of the true solution if not handled with care. Hence, regularization methods capable of addressing the ill-posed nature of such problems are essential.

During this short course, we will explore how a combination of variational methods, the graph Laplacian, and modern Deep Neural Networks (DNNs) can significantly improve the quality of the approximate solutions of some of the aforementioned problems. In particular, a significant part of the course will be dedicated to the properties of the graph Laplacian on infinite weighted graphs, which can rein in the powerful but unstable nature of DNNs and channel their power into a highly accurate and robust regularization method.
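As a rough, self-contained sketch of the variational ingredient of this combination (the functional, the edge weights and the toy data below are assumptions chosen purely for illustration, not the instructors' method), one can build a graph Laplacian from pixel similarities and use it as a Tikhonov-type regularizer for a linear inverse problem.

```python
import numpy as np

def graph_laplacian(img, sigma=0.1):
    """Dense graph Laplacian of a small image: nodes are pixels, and
    4-neighbouring pixels are joined by edges with Gaussian weights
    w_ij = exp(-|I_i - I_j|^2 / sigma^2)."""
    h, w = img.shape
    n = h * w
    W = np.zeros((n, n))
    idx = lambda i, j: i * w + j
    for i in range(h):
        for j in range(w):
            for di, dj in ((0, 1), (1, 0)):
                ii, jj = i + di, j + dj
                if ii < h and jj < w:
                    wij = np.exp(-(img[i, j] - img[ii, jj]) ** 2 / sigma ** 2)
                    W[idx(i, j), idx(ii, jj)] = W[idx(ii, jj), idx(i, j)] = wij
    return np.diag(W.sum(axis=1)) - W      # L = D - W

def graph_tikhonov(A, y, L, lam=0.1):
    """Solve  min_x ||A x - y||^2 + lam * x^T L x  via its normal equations
    (A^T A + lam L) x = A^T y."""
    return np.linalg.solve(A.T @ A + lam * L, A.T @ y)

# Toy denoising example: A is the identity, the data a noisy 8x8 checkerboard
rng = np.random.default_rng(0)
clean = np.kron(np.array([[0.0, 1.0], [1.0, 0.0]]), np.ones((4, 4)))
noisy = clean + 0.2 * rng.standard_normal(clean.shape)
L = graph_laplacian(noisy, sigma=0.5)
denoised = graph_tikhonov(np.eye(64), noisy.ravel(), L, lam=1.0).reshape(8, 8)
```

The course will discuss how DNNs enter this picture and how the graph Laplacian on infinite weighted graphs yields stability; the toy example above only illustrates the purely variational building block.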

Signal models and sampling theorems

Instructor: Karlheinz Gröchenig

The typical problem of data analysis is to fit data $(x_j, y_j)$ by a function $f$ such that $f$ approximates the value $y_j$ at the point $x_j$. Often we have additional information about $f$ through an underlying signal model. The standard assumption in 20th-century signal processing was that every signal is bandlimited, but nowadays many alternative models are in use.
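For the classical bandlimited model, reconstruction from uniform samples is given by Shannon's interpolation formula $f(t) = \sum_n f(nT)\,\mathrm{sinc}((t - nT)/T)$. Below is a minimal numerical sketch; the test signal and sampling rate are illustrative assumptions, and the finite truncation of the series makes the reconstruction only approximate.

```python
import numpy as np

def shannon_reconstruct(samples, T, t):
    """Reconstruct a bandlimited signal from uniform samples f(nT) via
    Shannon's interpolation formula f(t) = sum_n f(nT) sinc((t - nT) / T),
    exact for signals bandlimited to [-1/(2T), 1/(2T)] (here truncated to
    finitely many samples, hence approximate)."""
    n = np.arange(len(samples))
    # np.sinc is the normalized sinc: sinc(x) = sin(pi x) / (pi x)
    return np.array([np.sum(samples * np.sinc((tt - n * T) / T)) for tt in t])

# Example: a signal bandlimited to 2 Hz, sampled at 10 Hz (above the Nyquist rate)
T = 0.1
n = np.arange(100)
f = lambda t: np.cos(2 * np.pi * 1.5 * t) + 0.5 * np.sin(2 * np.pi * 2.0 * t)
samples = f(n * T)
t_fine = np.linspace(1.0, 8.0, 400)      # evaluate away from the boundary
approx = shannon_reconstruct(samples, T, t_fine)
```

Other signal models, such as shift-invariant spaces or variable-bandwidth functions, replace the sinc kernel by other generators, which is part of what the course will cover.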

In the short course we will study various signal models (bandlimited functions, shift-invariant spaces, functions of variable bandwidth, abstract versions of bandwidth) and address some of the following mathematical questions:

(i) What is possible? Is there a Nyquist rate or a necessary condition for sampling and reconstruction?

(ii) Sufficient conditions: under which conditions can a function be reconstructed or approximated in the context of a signal model?

(iii) When can data be interpolated in the context of a given signal model?

Workshop on Wednesday, September 4th

The one-day workshop will feature our two invited speakers, Philipp Grohs and Michael Unser, and contributed talks and posters by our participants (TBA).

How to register

Please fill in the form below to register.

Deadline for application for financial support: June 30th.
Deadline for contributed talks/posters: June 30th.

Registration Deadline

Wednesday, July 31, 2024


Registration Fees

There is no registration fee, but participants are required to register before the deadline (July 31st).