Title
Don’t be so Monotone: Relaxing Stochastic Line Search in Over-Parameterized Models
Speaker
Leonardo Galli - LMU Munich
Abstract
Recent works have shown that line search methods can speed up Stochastic Gradient Descent (SGD) and Adam in modern over-parameterized settings. However, existing line searches may take steps that are smaller than necessary, since they require a monotone decrease of the (mini-)batch objective function. We explore nonmonotone line search methods to relax this condition and possibly accept larger step sizes. Despite the lack of a monotone decrease, we prove the same fast rates of convergence as in the monotone case. Our experiments show that nonmonotone methods improve the speed of convergence and generalization properties of SGD/Adam even beyond the previous monotone line searches. We propose a POlyak NOnmonotone Stochastic (PoNoS) method, obtained by combining a nonmonotone line search with a Polyak initial step size. Furthermore, we develop a new resetting technique that, in the majority of the iterations, reduces the number of backtracking steps to zero while still maintaining a large initial step size. We conclude by showing that nonmonotone line searches operate at the edge of stability right from the start of training and consequently yield very flat solutions.
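To make the two ingredients named in the abstract concrete, here is a minimal sketch (not the speaker's implementation; the function name, defaults, and the deterministic full-gradient setting are illustrative assumptions) of a gradient step that combines a Polyak initial step size with a nonmonotone Armijo backtracking test, where sufficient decrease is measured against the maximum of the last few objective values rather than the current one:

```python
import numpy as np
from collections import deque

def polyak_nonmonotone_gd(f, grad_f, x0, n_steps=100, window=10,
                          c=0.5, beta=0.5, f_star=0.0):
    """Illustrative sketch: gradient descent with a Polyak initial step
    and a nonmonotone (max-over-window) Armijo backtracking condition.
    All parameter defaults here are assumptions, not the talk's settings."""
    x = x0.copy()
    recent = deque(maxlen=window)  # recent objective values (nonmonotone reference)
    for _ in range(n_steps):
        g = grad_f(x)
        fx = f(x)
        recent.append(fx)
        gnorm2 = g @ g
        if gnorm2 == 0.0:
            break
        # Polyak initial step; f_star ~ 0 in the over-parameterized
        # (interpolation) regime discussed in the abstract.
        t = (fx - f_star) / gnorm2
        ref = max(recent)  # relaxed reference value instead of fx itself
        # Nonmonotone Armijo: allow f to rise above fx as long as it
        # sufficiently decreases relative to the recent maximum.
        while f(x - t * g) > ref - c * t * gnorm2:
            t *= beta  # backtrack
        x = x - t * g
    return x
```

For example, on a consistent least-squares problem (so the interpolation assumption `f_star = 0` holds), the sketch converges to the solution; a stochastic variant would replace `f` and `grad_f` with mini-batch versions, which is the setting the talk addresses.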
Bio
Leonardo Galli obtained his B.S., M.S., and Ph.D. degrees from the University of Florence in 2013, 2016, and 2020, respectively. His Ph.D. advisors and mentors there were Prof. Marco Sciandrone and Prof. Fabio Schoen. To avoid stereotypical comments on Italians, he tried to escape his hometown a few times during his studies (University of Würzburg in 2015, UCLA in 2019, and National Taiwan University in 2020). During these stays, he collaborated with Prof. Christian Kanzow on generalized Nash equilibrium problems and with Prof. Chih-Jen Lin on truncated Newton methods for linear SVMs. In 2021 he moved to RWTH Aachen, where he won a two-year personal grant from North Rhine-Westphalia. Since then, he has collaborated with Prof. Holger Rauhut and Prof. Mark Schmidt on line search methods for deep learning. Since 2023, he has been a senior postdoc at LMU Munich.
When
Friday, November 15th, 14:30
Where
Room 322, UniGe DIBRIS/DIMA, Via Dodecaneso 35