Seminar

Wasserstein Gradient Flow on the Maximum Mean Discrepancy

10/09/2025

Arthur Gretton

Title

Wasserstein Gradient Flow on the Maximum Mean Discrepancy


Speaker

Arthur Gretton - University College London, Google DeepMind


Abstract

We construct a Wasserstein gradient flow on the Maximum Mean Discrepancy (MMD): an integral probability metric defined for a reproducing kernel Hilbert space (RKHS), which serves as a metric on probability measures for a sufficiently rich RKHS. This flow transports particles from an initial distribution to a target distribution, where the latter is provided simply as a sample, and can be used to generate new samples from the target distribution. We obtain conditions for convergence of the gradient flow towards a global optimum, and relate this flow to the problem of optimizing neural network parameters. We propose a way to regularize the MMD gradient flow, based on an injection of noise in the gradient, and give theoretical and empirical evidence for this procedure. We provide empirical validation of the MMD gradient flow in the setting of neural network training.


Bio

Arthur Gretton is a Professor with the Gatsby Computational Neuroscience Unit; director of the Centre for Computational Statistics and Machine Learning (CSML) at UCL; and Research Scientist at Google DeepMind. He received degrees in Physics and Systems Engineering from the Australian National University, and a PhD with Microsoft Research and the Signal Processing and Communications Laboratory at the University of Cambridge. He previously worked at the MPI for Biological Cybernetics, and at the Machine Learning Department, Carnegie Mellon University.

Arthur's recent research interests in machine learning include causal inference and representation learning, the design and training of generative models (implicit: Wasserstein gradient flows, GANs; explicit: energy-based models), and nonparametric hypothesis testing. He was an associate editor at IEEE Transactions on Pattern Analysis and Machine Intelligence from 2009 to 2013, and has been an Action Editor for JMLR since April 2013 and a member of the Royal Statistical Society Research Section Committee since January 2020. He served as an Area Chair for NeurIPS in 2008 and 2009, a Senior Area Chair for NeurIPS in 2018 and 2021, an Area Chair for ICML in 2011 and 2012, a Senior Area Chair for ICML in 2022, and a member of the COLT Program Committee in 2013. Arthur was program chair for AISTATS in 2016 (with Christian Robert), tutorials chair for ICML 2018 (with Ruslan Salakhutdinov), workshops chair for ICML 2019 (with Honglak Lee), program chair for the DALI workshop in 2019 (with Krikamol Muandet and Shakir Mohamed), and co-organiser of the Machine Learning Summer School 2019 in London (with Marc Deisenroth).


When

Wednesday, September 10th, 2025, 16:00


Where

Room 322, UniGe DIBRIS/DIMA, Via Dodecaneso 35