WANG Jianjun, XU Zongben. The essential order of approximation with weights of neural networks [J]. Chinese Annals of Mathematics, Series A, 2009, 30(6): 741-750.
The Essential Order of Approximation with Weights of Neural Networks
Received: April 27, 2009
Keywords: Approximation estimation, neural networks, Jacobi weights
Authors:
WANG Jianjun, School of Mathematics and Statistics, Southwest University, Chongqing 400715, China; Institute for Information and System Sciences, Xi'an Jiaotong University, Xi'an 710049, China. E-mail: wangjianjun@mail.xjtu.edu.cn
XU Zongben, Institute for Information and System Sciences, Xi'an Jiaotong University, Xi'an 710049, China. E-mail: zbxu@mail.xjtu.edu.cn
Chinese abstract (translated):
      The approximation of functions in L^q_ω by neural networks with a single hidden layer is proved, and upper and lower bound estimates for the network approximation are obtained. This result reveals, in the sense of weighted approximation, the relationship between the convergence order of the network and the number of hidden-layer units, and provides an important theoretical basis for applications of neural networks.
English abstract:
      This paper establishes the approximation ability of a feedforward neural network with a single hidden layer in L^q_ω, including estimates of the approximation upper and lower bounds. In the setting of weighted approximation, the work shows the relationship between the approximation accuracy of the underlying feedforward neural network and the number of hidden nodes. This result provides a theoretical foundation for the applications of feedforward neural networks.
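As background for readers, the following is a minimal sketch of the standard setting for results of this kind, not the paper's actual theorem; the activation function σ, the input dimension d, and the Jacobi weight ω stand in for the specific choices made in the paper:

$$
\mathcal{N}_n \;=\; \Big\{\, N(x)=\sum_{i=1}^{n} c_i\,\sigma(w_i\cdot x+\theta_i)\;:\; c_i,\theta_i\in\mathbb{R},\; w_i\in\mathbb{R}^d \,\Big\},
$$
$$
\operatorname{dist}_{L^q_\omega}(f,\mathcal{N}_n)\;=\;\inf_{N\in\mathcal{N}_n}\Big(\int \lvert f(x)-N(x)\rvert^{q}\,\omega(x)\,dx\Big)^{1/q}.
$$

An essential-order estimate pairs an upper bound on this distance with a matching lower bound, so that the two together determine how fast the weighted approximation error can decay as the number n of hidden units grows.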