Necessary Conditions of L_1-Convergence of Kernel Regression Estimators
Citation: |
Sun Dongchu. Necessary Conditions of L_1-Convergence of Kernel Regression Estimators[J]. Chinese Annals of Mathematics B, 1987, 8(4): 410-419
Authors: |
Sun Dongchu
|
|
Abstract: |
Let (X_1,Y_1),\cdots,(X_n,Y_n) be i.i.d. R^d\times R-valued samples of (X,Y). The kernel estimator of the regression function m(x)\triangleq E(Y|X=x) (if it exists), with kernel K, is denoted by
$$m_n(x)=\sum_{i=1}^{n} Y_i K\!\left(\frac{X_i-x}{h_n}\right)\Big/\sum_{j=1}^{n} K\!\left(\frac{X_j-x}{h_n}\right)$$
Many authors have discussed the convergence of m_n(x) in various senses under the conditions h_n\rightarrow 0 and nh_n^d\rightarrow\infty as n\rightarrow\infty. Are these conditions necessary? This paper gives an affirmative answer to this problem in the case of L_1-convergence, when K satisfies (1.3) and E(|Y|\log^+|Y|)<\infty.
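The estimator defined above is the classical Nadaraya-Watson kernel regression estimate: a weighted average of the responses Y_i, with weights determined by how close each X_i lies to the query point x on the scale of the bandwidth h_n. A minimal numerical sketch (using NumPy, with a Gaussian kernel chosen purely for illustration; the paper's condition (1.3) on K is not reproduced here):

```python
import numpy as np

def nadaraya_watson(x, X, Y, h, kernel=None):
    """Kernel regression estimate m_n(x) at a single point x.

    X: (n, d) array of sample points, Y: (n,) responses,
    h: bandwidth h_n. The Gaussian default kernel is an
    illustrative assumption, not prescribed by the paper.
    """
    X = np.asarray(X, dtype=float)
    Y = np.asarray(Y, dtype=float)
    if kernel is None:
        kernel = lambda u: np.exp(-0.5 * np.sum(u * u, axis=-1))
    # Weights K((X_i - x) / h_n) for each sample point.
    w = kernel((X - x) / h)
    s = w.sum()
    # Ratio of weighted responses to total weight (0 if no mass near x).
    return np.dot(w, Y) / s if s > 0 else 0.0

# Toy usage: recover m(x) = x^2 from noisy 1-d samples.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(500, 1))
Y = X[:, 0] ** 2 + 0.1 * rng.standard_normal(500)
print(nadaraya_watson(np.array([0.5]), X, Y, h=0.1))
```

With h_n shrinking to 0 and nh_n^d growing to infinity, the weighted average localizes around x while still aggregating many samples; the paper's contribution is that these two conditions are not merely sufficient but necessary for L_1-convergence.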
|
|
|
|
|