Title
Revisiting optimization: gradient descent with a general cost
Speaker
Pierre-Cyril Aubin-Frankowski - INRIA Paris (Sierra)
Abstract
We present a new class of gradient-type optimization methods that extends vanilla gradient descent, mirror descent, Riemannian gradient descent, and natural gradient descent. Our approach systematically constructs a surrogate for the objective function based on a chosen cost function; this surrogate is then minimized via an alternating minimization scheme. Using optimal transport theory, we establish convergence rates based on generalized notions of smoothness and convexity. We provide local versions of these two notions when the cost satisfies a condition known in optimal transport as nonnegative cross-curvature. In particular, our framework provides the first global rates for natural gradient descent and Newton's method. This is joint work with Flavien Léger (INRIA).
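To give a flavor of the surrogate idea, here is a minimal, hypothetical sketch (not the speaker's actual implementation): at each step one minimizes a linearization of the objective plus a chosen cost c(x, y). The function names (`gd_general_cost`, `cost_prox`) are illustrative; with the quadratic cost c(x, y) = ||x - y||^2 / (2τ), the update reduces to vanilla gradient descent with step size τ.

```python
import numpy as np

def gd_general_cost(grad_f, cost_prox, x0, iters=100):
    """Gradient-type descent via surrogate minimization (illustrative sketch).

    At each iteration, minimize the surrogate
        x -> f(x_k) + <grad f(x_k), x - x_k> + c(x, x_k)
    where c is a chosen cost. `cost_prox(g, y)` is assumed to return
    the minimizer over x of <g, x - y> + c(x, y).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = cost_prox(grad_f(x), x)
    return x

# Quadratic cost c(x, y) = ||x - y||^2 / (2 * tau) recovers vanilla GD:
tau = 0.1
quad_prox = lambda g, y: y - tau * g  # closed-form minimizer for this cost

grad_f = lambda x: x  # gradient of f(x) = ||x||^2 / 2, minimized at 0
x_final = gd_general_cost(grad_f, quad_prox, x0=[3.0, -2.0], iters=200)
```

Other choices of cost yield other methods from the family mentioned in the abstract, e.g. a Bregman divergence as the cost gives a mirror-descent-style update.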
Bio
Pierre-Cyril Aubin is a postdoctoral researcher at INRIA Paris (SIERRA team) with Alessandro Rudi. He obtained his PhD from MINES ParisTech in 2021, on estimation and control under constraints through kernel methods. His research is broadly centered on the links between kernel methods, optimal control, Kalman filtering, and optimization in measure spaces, with a recent interest in max-plus spaces and c-transforms.
When
May 12th 2023, 14:30
Where
Room 322, UniGe DIBRIS/DIMA, Via Dodecaneso 35