Title
A preconditioned second-order convex splitting algorithm with a difference of varying convex functions and line search
Speaker
Hongpeng Sun - Renmin University of China
Abstract
This talk presents a preconditioned convex splitting algorithm enhanced with line search techniques for nonconvex optimization problems. The algorithm uses second-order backward differentiation formulas (BDF) for the implicit and linear components and an Adams-Bashforth scheme for the nonlinear and explicit parts of the gradient flow of variational functionals. The proposed algorithm resembles a generalized difference-of-convex-functions approach in which the convex components change from iteration to iteration, and it integrates the Armijo line search strategy to improve performance. Classical preconditioners such as symmetric Gauss-Seidel, Jacobi, and Richardson are also discussed in this context. The global convergence of the algorithm is established through the Kurdyka-Łojasiewicz property, ensuring convergence with only a finite number of preconditioned iterations in each step. Numerical experiments demonstrate the high efficiency of the method. In addition, we will discuss another extrapolation framework that differs from the line search approach. This talk is based on joint work with Xinhua Shen and Zaijiu Shang.
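To give a concrete flavor of the kind of scheme described above, the following is a minimal, illustrative sketch, not the authors' implementation: it applies BDF2 to the implicit/linear part and a second-order Adams-Bashforth-type extrapolation to the explicit/nonlinear part of a gradient flow, solves the resulting linear system inexactly with a few Jacobi-preconditioned iterations, and applies Armijo backtracking along the update direction. The model problem (a 1D periodic Allen-Cahn-type energy), the splitting, the line-search criterion, all parameter values, and helper names such as `jacobi_solve` and `bdf2_ab2_step` are assumptions chosen for illustration only.

```python
# Illustrative sketch only: BDF2 (implicit, linear part) + Adams-Bashforth-type
# extrapolation (explicit, nonlinear part) + Jacobi-preconditioned inner iterations
# + Armijo backtracking, on a 1D periodic Allen-Cahn gradient flow.
import numpy as np

N, L = 128, 2 * np.pi               # grid points, domain length
h = L / N
eps2 = 0.1                          # interface parameter epsilon^2
tau = 0.05                          # time step
x = np.linspace(0.0, L, N, endpoint=False)

def lap(u):
    """Periodic second-order finite-difference Laplacian."""
    return (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / h**2

def energy(u):
    """Discrete energy: (eps2/2)|grad u|^2 + (1/4)(u^2 - 1)^2."""
    grad = (np.roll(u, -1) - u) / h
    return h * np.sum(0.5 * eps2 * grad**2 + 0.25 * (u**2 - 1.0)**2)

def jacobi_solve(rhs, u0, n_iter=20):
    """Fixed number of Jacobi-preconditioned iterations for
    (3/(2 tau) I - eps2 * Lap) u = rhs."""
    diag = 3.0 / (2.0 * tau) + 2.0 * eps2 / h**2   # diagonal of the operator
    u = u0.copy()
    for _ in range(n_iter):
        residual = rhs - (3.0 / (2.0 * tau) * u - eps2 * lap(u))
        u = u + residual / diag                     # diagonal (Jacobi) preconditioner
    return u

def bdf2_ab2_step(u_curr, u_prev):
    """One BDF2 / Adams-Bashforth splitting step with Armijo backtracking."""
    u_star = 2.0 * u_curr - u_prev                  # AB2-type extrapolation
    f_expl = u_star**3 - u_star                     # explicit nonlinear term
    rhs = (4.0 * u_curr - u_prev) / (2.0 * tau) - f_expl
    u_cand = jacobi_solve(rhs, u_curr)              # inexact implicit solve

    # Armijo backtracking along the update direction (illustrative criterion).
    d = u_cand - u_curr
    alpha, c, E0 = 1.0, 1e-4, energy(u_curr)
    while energy(u_curr + alpha * d) > E0 - c * alpha * h * np.sum(d**2) and alpha > 1e-8:
        alpha *= 0.5
    return u_curr + alpha * d

# BDF2 needs two starting states; a first-order step could supply the second one.
u0 = 0.05 * np.cos(x)
u1 = u0.copy()
for _ in range(200):
    u0, u1 = u1, bdf2_ab2_step(u1, u0)
print("final energy:", energy(u1))
```

The point of the combination sketched here is that the implicit step is only solved inexactly, with a fixed, finite number of preconditioned iterations, while the line search still enforces decrease of the energy; the talk's algorithm treats this rigorously within a generalized difference-of-convex framework.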
When
Thursday October 23rd, 2pm
Where
Room 715, UniGe DIBRIS/DIMA, Via Dodecaneso 35