Convergence of Gradient Algorithms for Nonconvex C1+α Cost Functions*

Citation:

Zixuan WANG. Convergence of Gradient Algorithms for Nonconvex C1+α Cost Functions [J]. Chinese Annals of Mathematics B, 2023, 44(3): 445–464

Authors:

Zixuan WANG;

Foundation:

National Natural Science Foundation of China (Nos. 11631004,12031009) and the National Key R&D Program of China (No. 2018YFA0703900).
Abstract: This paper is concerned with convergence of stochastic gradient algorithms with momentum terms in the nonconvex setting. A class of stochastic momentum methods, including stochastic gradient descent, heavy ball and Nesterov's accelerated gradient, is analyzed in a general framework under mild assumptions. Based on the convergence result of expected gradients, the authors prove the almost sure convergence by a detailed discussion of the effects of momentum and the number of upcrossings. It is worth noting that no additional restrictions are imposed on the objective function and stepsize. Another improvement over previous results is that the existing Lipschitz condition of the gradient is relaxed into the condition of Hölder continuity. As a byproduct, the authors apply a localization procedure to extend the results to stochastic stepsizes.
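The class of methods the abstract describes can be illustrated with a minimal sketch. The objective (a toy double-well), the noise model, the stepsize schedule and all names below are illustrative assumptions, not the paper's notation or results; the sketch only shows how stochastic gradient descent, heavy-ball and Nesterov momentum fit one update scheme with diminishing stepsizes.

```python
import numpy as np

def stoch_grad(x, rng):
    # Noisy gradient of the nonconvex double-well f(x) = x**4/4 - x**2/2,
    # whose true gradient is x**3 - x (critical points at 0 and ±1).
    return x**3 - x + rng.normal(scale=0.1)

def momentum_sgd(x0, steps=5000, beta=0.9, nesterov=False, seed=0):
    """One scheme covering SGD (beta=0), heavy ball, and Nesterov momentum."""
    rng = np.random.default_rng(seed)
    x, v = float(x0), 0.0
    for k in range(1, steps + 1):
        gamma = 0.1 / k**0.6                      # diminishing stepsize (illustrative)
        lookahead = x + beta * v if nesterov else x  # Nesterov evaluates ahead
        v = beta * v - gamma * stoch_grad(lookahead, rng)
        x = x + v
    return x

x_hb = momentum_sgd(2.0, nesterov=False)   # heavy-ball iterate
x_nag = momentum_sgd(2.0, nesterov=True)   # Nesterov iterate
```

In this toy run both iterates drift toward a critical point of the double well, i.e. the deterministic gradient `x**3 - x` becomes small along the trajectory, in the spirit of the almost sure convergence of gradients analyzed in the paper.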

Keywords:

Gradient descent methods, Nonconvex optimization, Accelerated gradient descent, Heavy-ball momentum

Classification:

62L20, 90C26

Supervised by: Ministry of Education of China. Sponsored by: Fudan University. Address: 220 Handan Road, Fudan University, Shanghai, China. E-mail: edcam@fudan.edu.cn
