Title
Sharp convergence rates for spectral methods via the Feature Space Decomposition method
Speaker
Zong Shang - CREST-ENSAE
Abstract
In this talk, I will apply the Feature Space Decomposition (FSD) method, developed in previous joint work with Guillaume Lecué, to obtain, under fairly general conditions, matching upper and lower bounds on the population excess risk of spectral methods in linear regression under the squared loss, for every covariance structure and every signal. For a given linear regression problem, this result lets us define a partial order on the set of all spectral methods according to the convergence rates of their population excess risk, thereby characterizing which spectral algorithm is superior for that specific problem. It also allows us to generalize the saturation effect known from inverse problems and to give necessary and sufficient conditions for its occurrence. Our method further shows that, under broad conditions, no spectral algorithm has a feature learning property, and therefore none can overcome the information-exponent barrier in problems such as single-index learning. From a methodological perspective, we merely apply the idea of FSD as a wrapper around the classical analysis of the statistical properties of spectral methods, which yields the above results with virtually no additional effort. This talk is based on joint work with Guillaume Lecué and Zhifan Li, available at http://arxiv.org/abs/2512.14473.
Bio
I am a third-year PhD student in the Department of Statistics at CREST-ENSAE, Institut Polytechnique de Paris, advised by Guillaume Lecué and Matthieu Lerasle. My research focuses on Statistical Learning Theory and its connections to Empirical Process Theory and Geometric Aspects of Functional Analysis. I work on advancing one of the fundamental methodologies of statistical learning theory and mathematical statistics: uniform convergence arguments. This not only provides new tools and theoretical frameworks for modern phenomena such as benign overfitting and feature learning, but also offers fresh insights into the predictive properties of classical estimators in mathematical statistics. Starting in August 2026, I will join the School of Mathematics at Georgia Tech as a Visiting Assistant Professor (postdoc), mentored by Vladimir Koltchinskii.
When
Monday, March 16th, 14:30
Where
Room 322, UniGe DIBRIS/DIMA, Via Dodecaneso 35