Gradient Convergence of Deep Learning-Based Numerical Methods for BSDEs

Citation:

Zixuan WANG, Shanjian TANG. Gradient Convergence of Deep Learning-Based Numerical Methods for BSDEs [J]. Chinese Annals of Mathematics B, 2021, 42(2): 199-216

Authors:

Zixuan WANG; Shanjian TANG

Foundation:

National Key R&D Program of China (No. 2018YFA0703900) and the National Natural Science Foundation of China (No. 11631004).
Abstract: The authors prove the gradient convergence of the deep learning-based numerical method for high-dimensional parabolic partial differential equations and backward stochastic differential equations, which is based on the time discretization of stochastic differential equations (SDEs for short) and the stochastic approximation method for nonconvex stochastic programming problems. They take the stochastic gradient descent method, quadratic loss function, and sigmoid activation function in the setting of the neural network. Combining classical techniques of randomized stochastic gradients, the Euler scheme for SDEs, and convergence of neural networks, they obtain the O(K^{-1/4}) rate of gradient convergence, with K being the total number of iterative steps.
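The scheme described in the abstract can be illustrated with a minimal sketch. The toy problem below (one-dimensional, dX = dW, zero driver, terminal condition g(x) = x², so the true Y_0 = E[X_T²] = T) and all parameter names are illustrative assumptions, not the paper's setting: the forward SDE is discretized by the Euler scheme, Z at each time step is parametrized by a one-neuron sigmoid network, and the quadratic terminal loss is minimized by stochastic gradient descent over K iterations, with gradients derived by hand since Y_T is linear in each network's output.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy BSDE with zero driver on [0, T]: dX = dW, Y_T = g(X_T) = X_T^2.
# Then Y_0 = E[X_T^2] = T, which the trained y0 should approximate.
# All constants below are illustrative choices, not the paper's.
T, N, M, K, lr = 1.0, 20, 512, 3000, 0.02
dt = T / N

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

# Trainable parameters: the scalar y0 (approximating Y_0) and, per step n,
# a one-neuron sigmoid network z_n(x) = w2[n]*sigmoid(w1[n]*x + b1[n]) + b2[n]
# approximating Z_{t_n}.
y0 = 0.0
w1 = rng.normal(size=N); b1 = np.zeros(N)
w2 = 0.1 * rng.normal(size=N); b2 = np.zeros(N)

for k in range(K):
    X = np.zeros(M)             # forward SDE samples, X_0 = 0
    Y = np.full(M, y0)
    dWs, Xs, Ss = [], [], []
    for n in range(N):
        dW = rng.normal(scale=np.sqrt(dt), size=M)
        s = sigmoid(w1[n] * X + b1[n])
        Z = w2[n] * s + b2[n]
        Y = Y + Z * dW          # Euler step of the BSDE (zero driver)
        dWs.append(dW); Xs.append(X.copy()); Ss.append(s)
        X = X + dW              # Euler step of the forward SDE
    # Quadratic loss |Y_T - g(X_T)|^2; since Y_T is linear in each Z_n,
    # the gradients are one chain-rule step per parameter.
    r = 2.0 * (Y - X**2)        # dL/dY_T per sample
    gy0 = r.mean()
    gw2 = np.empty(N); gb2 = np.empty(N)
    gw1 = np.empty(N); gb1 = np.empty(N)
    for n in range(N):
        gZ = r * dWs[n]         # dL/dZ_n per sample
        s = Ss[n]
        gw2[n] = (gZ * s).mean()
        gb2[n] = gZ.mean()
        gw1[n] = (gZ * w2[n] * s * (1 - s) * Xs[n]).mean()
        gb1[n] = (gZ * w2[n] * s * (1 - s)).mean()
    # SGD updates with all gradients taken at the pre-update parameters.
    y0 -= lr * gy0
    w2 -= lr * gw2; b2 -= lr * gb2
    w1 -= lr * gw1; b1 -= lr * gb1

print(y0)   # should be close to T = 1.0
```

Because the Brownian increments have zero conditional mean, the optimal y0 equals E[g(X_T)] = T regardless of how well the Z networks are fit, so y0 converges quickly even in this crude setup; the paper's result controls the gradient norm of the full nonconvex objective at the stated O(K^{-1/4}) rate.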

Keywords:

PDEs, BSDEs, Deep learning, Nonconvex stochastic programming, Convergence result

Classification:

62L20, 90C26

Supervised by: Ministry of Education of China. Sponsored by: Fudan University. Address: 220 Handan Road, Fudan University, Shanghai, China. E-mail: edcam@fudan.edu.cn
