Nonparametric Estimation for an Autoregressive Model | Вестник Томского государственного университета. Математика и механика. 2008. № 2 (3).

Nonparametric Estimation for an Autoregressive Model

The paper deals with the problem of nonparametric estimation at a given fixed point for an autoregressive model with noise of unknown distribution. Modifications of kernel estimates are proposed. Asymptotic minimax and efficiency properties of the proposed estimators are established.

1. Introduction

We consider the following nonparametric autoregressive model:

$y_k = S(x_k)\, y_{k-1} + \xi_k, \quad 1 \le k \le n,$   (1.1)

where $S(\cdot)$ is an unknown function from $\mathbb{R}$ to $\mathbb{R}$, $x_k = k/n$, $y_0$ is a constant and the noise random variables $(\xi_k)_{1 \le k \le n}$ are i.i.d. with an unknown density. To estimate $S$ at a point $z_0 \in (0,1)$ we use the kernel estimator

$\widehat{S}_n(z_0) = \dfrac{1}{A_n} \sum_{k=1}^{n} Q(u_k)\, y_{k-1}\, y_k\, \mathbf{1}_{(A_n \ge d_n)}, \quad A_n = \sum_{k=1}^{n} Q(u_k)\, y_{k-1}^2, \quad u_k = \dfrac{x_k - z_0}{h},$   (1.2)

where $Q(\cdot)$ is a kernel function vanishing outside $[-1,1]$, $h$ is a bandwidth and $d_n$ is a threshold for the denominator. The normalizing coefficient is

$\varphi_n = \sqrt{nh}.$   (1.3)

We take the bandwidth

$h = n^{-1/3},$   (1.4)

so that $\varphi_n = n^{1/3}$, and the threshold $d_n = \kappa_n\, nh$, where the positive sequence $(\kappa_n)_{n \ge 1}$ decreases to zero slowly, i.e.

$\lim_{n\to\infty} \kappa_n = 0 \quad \text{and} \quad \lim_{n\to\infty} (n^{\delta} \kappa_n)^{-1} = 0 \ \text{for any}\ \delta > 0.$   (1.5)

As to the kernel function we assume that

$\int_{-1}^{1} Q(z)\, dz > 0 \quad \text{and} \quad \int_{-1}^{1} z\, Q(z)\, dz = 0.$   (1.6)

In this paper we show that the estimator (1.2) with the parameters (1.4)–(1.6) is asymptotically minimax, i.e. we show that the asymptotic upper bound for the minimax risk with respect to the stable local Hölder class is finite. At the next step we study sharp asymptotic properties of the minimax estimators (1.2). To this end, similarly to [1], we introduce the weak stable local Hölder class. In this case we find a positive constant giving the exact asymptotic lower bound for the minimax risk with the normalizing coefficient (1.3). Moreover, we show that for the estimator (1.2) with the parameters (1.4)–(1.5) and the indicator kernel $Q = \mathbf{1}_{[-1,1]}$ the asymptotic upper bound of the minimax risk coincides with this constant, i.e. in this case such estimators are asymptotically efficient.

In [9] Belitser considered the above model under Lipschitz conditions. The author proposed a recursive estimator and studied the estimation problem at a fixed point. For the quadratic risk, Belitser established a convergence rate without showing its optimality. Moulines et al. [10] showed that this convergence rate is optimal for the quadratic risk, using a recursive method for an autoregressive model of order $d$. We note that in the present paper we also establish an optimal convergence rate, but the risk considered here differs from the one used in [10], and our assumptions are weaker than those of [10].

The paper is organized as follows. In the next section we give the main results. In Section 3 we find asymptotic lower bounds for the minimax risks. Section 4 is devoted to upper bounds. The Appendix contains some technical results.

2. Main results

First of all we assume that the noise in the model (1.1), i.e. the i.i.d. random variables $(\xi_k)_{1 \le k \le n}$, have a density $p$ (with respect to the Lebesgue measure) from the functional class

$\mathcal{P} = \Big\{ p \ge 0 :\ \int_{\mathbb{R}} p(x)\, dx = 1,\ \int_{\mathbb{R}} x\, p(x)\, dx = 0,\ \int_{\mathbb{R}} x^2 p(x)\, dx = 1,\ \int_{\mathbb{R}} x^4 p(x)\, dx \le \sigma_* \Big\},$

where $\sigma_* \ge 3$ is a fixed constant. Note that the $(0,1)$-gaussian density belongs to $\mathcal{P}$; in the sequel we denote this density by $p_0$.

The problem is to estimate the function $S(\cdot)$ at a fixed point $z_0 \in (0,1)$, i.e. the value $S(z_0)$. For this problem we make use of the risk proposed in [1]. Namely, for any estimate $\widetilde{S} = \widetilde{S}_n(z_0)$ (i.e. any function measurable with respect to the observations $(y_k)_{1 \le k \le n}$) we set

$\mathcal{R}_n(\widetilde{S}_n, S) = \sup_{p \in \mathcal{P}} \mathbf{E}_{S,p}\, |\widetilde{S}_n(z_0) - S(z_0)|,$

where $\mathbf{E}_{S,p}$ denotes the expectation under the coefficient function $S$ and the noise density $p$. Furthermore, for $0 < \varepsilon < 1$ we introduce the stability class

$\Gamma_\varepsilon = \{ S \in C^1([0,1], \mathbb{R}) :\ \|S\| \le 1 - \varepsilon \}, \quad \|S\| = \sup_{0 \le x \le 1} |S(x)|,$

and, for $K > 0$, the stable local Hölder class at the point $z_0$

$H(z_0, K, \varepsilon) = \{ S \in \Gamma_\varepsilon :\ |S(x) - S(z_0)| \le K\, |x - z_0| \ \text{for all}\ x \in [0,1] \}.$

The first result gives an asymptotic lower bound for the minimax risk over this class.

Theorem 2.1. For any $K > 0$ and $0 < \varepsilon < 1$

$\liminf_{n\to\infty}\ \inf_{\widetilde{S}_n}\ \sup_{S \in H(z_0, K, \varepsilon)} \varphi_n\, \mathcal{R}_n(\widetilde{S}_n, S) > 0,$   (2.5)

where the infimum is taken over all estimators.

Now we obtain an upper bound for the kernel estimator (1.2).

Theorem 2.2. For any $K > 0$ and $0 < \varepsilon < 1$ the kernel estimator (1.2) with the parameters (1.4)–(1.6) satisfies the inequality

$\limsup_{n\to\infty}\ \sup_{S \in H(z_0, K, \varepsilon)} \varphi_n\, \mathcal{R}_n(\widehat{S}_n, S) < \infty.$

Theorems 2.1 and 2.2 mean that (1.3) is the minimax convergence rate over the stable local Hölder class. To describe the sharp asymptotic properties of the estimator we introduce the weak stable local Hölder class. For any $\delta > 0$ we set

$U_{\delta,n}(z_0, \varepsilon) = \{ S \in \Gamma_\varepsilon :\ \|\dot S\| \le \delta^{-1} \ \text{and}\ |\Omega_h(z_0, S)| \le \delta\, \varphi_n^{-1} \},$   (2.7)

where

$\Omega_h(z_0, S) = \int_{-1}^{1} \big( S(z_0 + uh) - S(z_0) \big)\, du$

and $h$ is given in (1.4). Moreover, we set

$\tau(S) = 1 - S^2(z_0).$   (2.8)

With the help of this function we describe the sharp lower bound for the minimax risks in this case.

Theorem 2.3. For any $\delta > 0$ and $0 < \varepsilon < 1$

$\liminf_{n\to\infty}\ \inf_{\widetilde{S}_n}\ \sup_{S \in U_{\delta,n}(z_0, \varepsilon)} \tau^{-1/2}(S)\, \varphi_n\, \mathcal{R}_n(\widetilde{S}_n, S) \ge \mathbf{E}|\eta|,$   (2.9)

where $\eta$ is a gaussian random variable with the parameters $(0, 1/2)$.

Theorem 2.4. The estimator (1.2) with the parameters (1.4)–(1.5) and $Q(z) = \mathbf{1}_{[-1,1]}(z)$ satisfies the inequality

$\lim_{\delta\to 0}\ \limsup_{n\to\infty}\ \sup_{S \in U_{\delta,n}(z_0, \varepsilon)} \tau^{-1/2}(S)\, \varphi_n\, \mathcal{R}_n(\widehat{S}_n, S) \le \mathbf{E}|\eta|,$

where $\eta$ is a gaussian random variable with the parameters $(0, 1/2)$.

Theorems 2.3 and 2.4 imply that the estimator (1.2), (1.4)–(1.5) with the indicator kernel is asymptotically efficient.
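As a concrete illustration of the construction (1.1)–(1.4), the following minimal Python sketch simulates the model with the gaussian noise density $p_0$ and evaluates the kernel estimator with the indicator kernel $Q = \mathbf{1}_{[-1,1]}$. The coefficient function $S$, the point $z_0$ and the sample size are illustrative choices of ours, not taken from the paper, and the denominator threshold of (1.2) is simplified to a positivity check.

```python
import numpy as np

def simulate(S, n, rng):
    """Generate y_k = S(x_k) y_{k-1} + xi_k, x_k = k/n, with N(0,1) noise (model (1.1))."""
    y = np.zeros(n + 1)                      # y[0] = y_0, taken to be 0 here
    xi = rng.standard_normal(n)
    for k in range(1, n + 1):
        y[k] = S(k / n) * y[k - 1] + xi[k - 1]
    return y

def kernel_estimate(y, z0, h):
    """Kernel estimator (1.2) with the indicator kernel Q = 1_[-1,1].

    The denominator threshold of (1.2) is simplified here to a positivity check."""
    n = len(y) - 1
    x = np.arange(1, n + 1) / n
    Q = (np.abs((x - z0) / h) <= 1.0).astype(float)   # Q(u_k), u_k = (x_k - z0)/h
    A_n = np.sum(Q * y[:-1] ** 2)                     # A_n = sum Q(u_k) y_{k-1}^2
    return np.sum(Q * y[:-1] * y[1:]) / A_n if A_n > 0 else 0.0

rng = np.random.default_rng(0)
S = lambda x: 0.5 * np.cos(np.pi * x)   # illustrative stable coefficient, |S| <= 1/2 < 1
n, z0 = 10_000, 0.3
h = n ** (-1.0 / 3.0)                   # bandwidth of order n^{-1/3}, cf. (1.4)
y = simulate(S, n, rng)
print(kernel_estimate(y, z0, h), "vs true value", S(z0))
```

Since the indicator kernel is symmetric, the moment condition (1.6) holds for it automatically.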
Remark 2.1. One can show (see [1]) that for any $0 < \delta < 1$ and $n \ge 1$

$H(z_0, \delta, \varepsilon) \subseteq U_{\delta,n}(z_0, \varepsilon).$

This means that the «natural» normalizing coefficient for the functional class (2.7) is the sequence (1.3). Theorems 2.3 and 2.4 thus extend the usual Hölder approach to pointwise estimation while keeping the minimax convergence rate (1.3).

3. Lower bounds

3.1. Proof of Theorem 2.1

Note that to prove (2.5) it suffices to show that

$\liminf_{n\to\infty}\ \inf_{\widetilde{S}_n}\ \sup_{S \in H(z_0, K, \varepsilon)} \mathbf{E}_{S,p_0}\, \psi_n(\widetilde{S}_n, S) > 0,$   (3.1)

where $\psi_n(\widetilde{S}_n, S) = \varphi_n\, |\widetilde{S}_n(z_0) - S(z_0)|$. We make use of the method proposed by Ibragimov and Hasminskii in [7] for obtaining a lower bound in the density estimation problem. First we choose a corresponding parametric family in $H(z_0, K, \varepsilon)$. Let $V$ be a two times continuously differentiable function such that $\int_{-1}^{1} V(z)\, dz > 0$ and $V(z) = 0$ for $|z| \ge 1$. We set

$S_u(x) = \dfrac{u}{\varphi_n}\, V\Big( \dfrac{x - z_0}{h} \Big),$   (3.2)

where $\varphi_n$ and $h$ are defined in (1.3) and (1.4). It is easy to see that for any $z_0 - h \le x \le z_0 + h$

$|S_u(x) - S_u(z_0)| = \dfrac{|u|}{\varphi_n}\, \Big| V\Big(\dfrac{x - z_0}{h}\Big) - V(0) \Big| \le \dfrac{|u|\, V^*}{\varphi_n h}\, |x - z_0| = |u|\, V^*\, |x - z_0|,$

where $V^* = \max_{|z| \le 1} |\dot V(z)|$ and we used the equality $\varphi_n h = 1$. Therefore, for all $0 < |u| \le u^* = K / V^*$ the function $S_u$ satisfies the Hölder condition with the constant $K$; moreover, $\|S_u\| \to 0$ as $n \to \infty$, so there exists $n_{K,\varepsilon}$ such that $S_u \in H(z_0, K, \varepsilon)$ for all $n \ge n_{K,\varepsilon}$. Therefore, for all $n \ge n_{K,\varepsilon}$ and for any estimator $\widetilde{S}_n$ we bound the supremum in (3.1) from below as

$\sup_{S \in H(z_0, K, \varepsilon)} \mathbf{E}_{S,p_0}\, \psi_n(\widetilde{S}_n, S) \ \ge\ \sup_{|u| \le b} \mathbf{E}_{S_u,p_0}\, \psi_n(\widetilde{S}_n, S_u) \ \ge\ \dfrac{1}{2b} \int_{-b}^{b} \mathbf{E}_{S_u,p_0}\, \psi_n(\widetilde{S}_n, S_u)\, du,$   (3.3)

where $0 < b \le u^*$.

Let $\mathbf{p}_{0,p_0}$ be the distribution of the vector $(y_1, \dots, y_n)$ in (1.1) corresponding to the function $S = 0$ and the gaussian $(0,1)$ noise density $p_0$, i.e. the random variables $(y_1, \dots, y_n)$ are i.i.d. $\mathcal{N}(0,1)$ with respect to the measure $\mathbf{p}_{0,p_0}$. In the sequel we denote $\mathbf{p}_{0,p_0}$ by $\mathbf{P}$. It is easy to see that in this case the Radon–Nikodym derivative can be written as

$\rho_n(u) = \dfrac{d\,\mathbf{P}_{S_u, p_0}}{d\,\mathbf{P}}\,(y_1, \dots, y_n) = \exp\Big\{ u\, \Lambda_n - \dfrac{u^2}{2}\, \zeta_n^2 \Big\}$

with

$\zeta_n^2 = \dfrac{1}{\varphi_n^2} \sum_{k=1}^{n} V^2(u_k)\, y_{k-1}^2 \quad \text{and} \quad \Lambda_n = \dfrac{1}{\varphi_n} \sum_{k=1}^{n} V(u_k)\, y_{k-1}\, \xi_k.$

By the law of large numbers we obtain

$\mathbf{P}\text{-}\lim_{n\to\infty} \zeta_n^2 = \lim_{n\to\infty} \dfrac{1}{nh} \sum_{k=k_*}^{k^*} V^2(u_k) = \int_{-1}^{1} V^2(u)\, du = \sigma^2,$

where

$k_* = [n z_0 - nh] + 1 \quad \text{and} \quad k^* = [n z_0 + nh].$   (3.4)

Here $[a]$ is the integer part of $a$. Moreover, by the central limit theorem for martingales (see [2] and [3]), under the measure $\mathbf{P}$

$\Lambda_n \Rightarrow \mathcal{N}(0, \sigma^2) \quad \text{as}\ n \to \infty.$

Therefore the Radon–Nikodym density can be represented in the following asymptotic form:

$\rho_n(u) = \exp\Big\{ u\, \Lambda_n - \dfrac{u^2 \sigma^2}{2} + r_n(u) \Big\}, \quad \mathbf{P}\text{-}\lim_{n\to\infty} r_n(u) = 0.$

This means that the family $(\rho_n(u))_{n \ge 1}$ satisfies the L.A.N. property, and we can make use of the method of Theorem 12.1 of [7] to obtain the inequality

$\liminf_{n\to\infty}\ \inf_{\widetilde{S}_n}\ \dfrac{1}{2b} \int_{-b}^{b} \mathbf{E}_{S_u,p_0}\, \psi_n(\widetilde{S}_n, S_u)\, du \ \ge\ I(b, \sigma),$   (3.5)

where $I(b, \sigma) > 0$ is an explicit constant given by a gaussian integral depending only on $b$ and $\sigma$, and $0 < b < u^*$. Therefore, inequalities (3.3) and (3.5) imply (3.1). Hence Theorem 2.1. ∎

3.2. Proof of Theorem 2.3

First, similarly to the proof of Theorem 2.1, we choose a corresponding parametric family of the form (3.2) with the function $V = V_v$ defined as a smooth approximation of the indicator kernel, namely

$V_v(x) = \dfrac{1}{v} \int_{\mathbb{R}} \lambda_v(u)\, g\Big( \dfrac{x - u}{v} \Big)\, du, \quad \lambda_v = \mathbf{1}_{(|u| \le 1 - 2v)},$

where $0 < v < 1/2$ and $g \ge 0$ is an infinitely differentiable function vanishing outside $[-1,1]$ with $\int_{-1}^{1} g(z)\, dz = 1$. The remainder of the proof follows the L.A.N. argument of the previous subsection; as $v \to 0$, $\int_{-1}^{1} V_v^2(u)\, du \to 2$, which leads to the limiting variance $\tau(S)/2$ and hence to the bound (2.9) with $\eta \sim \mathcal{N}(0, 1/2)$.

4. Upper bounds

We now sketch the proof of the upper bounds of Theorems 2.2 and 2.4 for the estimator (1.2) with the indicator kernel. The deviation $\varphi_n (\widehat{S}_n(z_0) - S(z_0))$ splits into a martingale term $Z_n$ and a bias term $B_n$, and the technical Lemma A.2 of the Appendix provides the necessary bounds on the denominator $A_n$, uniformly in $S \in \Gamma_\varepsilon$ and $p \in \mathcal{P}$. By Lemma A.2 we obtain that, uniformly in $S \in \Gamma_\varepsilon$ and $p \in \mathcal{P}$,

$\tau^{-1/2}(S)\, Z_n \Rightarrow \mathcal{N}(0, 1/2) \quad \text{as}\ n \to \infty.$

Moreover, by applying the Burkhölder inequality and Lemma A.2 to the martingale $Z_n$, we deduce that the sequence $(\tau^{-1/2}(S)\, Z_n)_{n \ge 1}$ is uniformly integrable. This means that

$\lim_{n\to\infty}\ \sup_{S \in H(z_0, K, \varepsilon)}\ \sup_{p \in \mathcal{P}} \Big| \tau^{-1/2}(S)\, \mathbf{E}_{S,p} |Z_n| - \mathbf{E}|\eta| \Big| = 0,$

where $\eta$ is a gaussian random variable with the parameters $(0, 1/2)$. Now, to finish the proof we have to show that

$\lim_{\delta\to 0}\ \limsup_{n\to\infty}\ \sup_{S \in U_{\delta,n}(z_0, \varepsilon)}\ \sup_{p \in \mathcal{P}} \mathbf{E}_{S,p} |B_n| = 0.$   (4.4)

Indeed, by setting $f_S(u) = S(z_0 + uh) - S(z_0)$ we rewrite $B_n$ as

$B_n = \dfrac{1}{\varphi_n} \sum_{k=k_*}^{k^*} f_S(u_k)\, y_{k-1}^2 = \varphi_n\, G_n(f_S, S) + \dfrac{\varphi_n}{\tau(S)}\, \Omega_h(z_0, S).$   (4.5)

For $S \in U_{\delta,n}(z_0, \varepsilon)$ the last term in (4.5) is bounded by $\delta / \tau(S) \le \delta / \varepsilon$, which vanishes as $\delta \to 0$, while the remainder term $\varphi_n\, G_n(f_S, S)$ is treated by the technical results of the Appendix. This yields (4.4).
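The efficiency constant of Theorems 2.3 and 2.4 can be checked numerically: for $\eta \sim \mathcal{N}(0, 1/2)$ one has $\mathbf{E}|\eta| = 1/\sqrt{\pi} \approx 0.564$. The following Monte Carlo sketch (again with illustrative choices of $S$, $z_0$ and $n$ that are not from the paper) approximates $\tau^{-1/2}(S)\, \varphi_n\, \mathbf{E}_{S,p_0} |\widehat{S}_n(z_0) - S(z_0)|$ for the indicator-kernel estimator and compares it with this constant.

```python
import numpy as np

rng = np.random.default_rng(1)
S = lambda x: 0.5 * np.cos(np.pi * x)       # illustrative stable coefficient
n, z0, reps = 20_000, 0.3, 300
h = n ** (-1.0 / 3.0)                       # bandwidth (1.4)
phi = np.sqrt(n * h)                        # normalizing coefficient (1.3)
tau = 1.0 - S(z0) ** 2                      # tau(S) = 1 - S^2(z_0), cf. (2.8)

x = np.arange(1, n + 1) / n
Q = (np.abs((x - z0) / h) <= 1.0).astype(float)   # indicator kernel Q = 1_[-1,1]
s = S(x)

errs = np.empty(reps)
for i in range(reps):
    xi = rng.standard_normal(n)
    y = np.zeros(n + 1)
    for k in range(n):                      # model (1.1) with gaussian noise p_0
        y[k + 1] = s[k] * y[k] + xi[k]
    est = np.sum(Q * y[:-1] * y[1:]) / np.sum(Q * y[:-1] ** 2)
    errs[i] = est - S(z0)

print("empirical tau^{-1/2} phi_n E|error|:", phi * np.mean(np.abs(errs)) / np.sqrt(tau))
print("E|eta| for eta ~ N(0, 1/2):", 1.0 / np.sqrt(np.pi))
```

For moderate $n$ the empirical value stays close to $1/\sqrt{\pi}$; the residual gap reflects the bias term $B_n$ of (4.5), which vanishes under the weak Hölder constraint as $\delta \to 0$.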

Keywords

nonparametric autoregression, minimax, kernel estimates, asymptotical efficiency

Authors

Arkoun Ouerdia, Université de Rouen, PhD, Laboratoire de Mathématiques Raphaël Salem, UMR 6085 CNRS. E-mail: Ouerdia.Arkoun@etu.univ-rouen.fr
Pergamenchtchikov Sergei, Université de Rouen (France), Professor, Laboratoire de Mathématiques Raphaël Salem. E-mail: Serge.Pergamenchtchikov@univ-rouen.fr

References

Moulines E., Priouret P., Roueff F. On recursive estimation for time varying autoregressive processes // The Annals of Statistics. 2005. V. 33. No. 6. P. 2610–2654.
Shiryaev A.N. Probability. Second Edition. Springer, 1992.
Ibragimov I.A., Hasminskii R.Z. Statistical Estimation: Asymptotic Theory. Berlin; New York: Springer, 1981.
Belitser E. Local minimax pointwise estimation of a multivariate density // Statistica Neerlandica. 2000. V. 54. No. 3. P. 351–365.
Belitser E. Recursive estimation of a drifted autoregressive parameter // The Annals of Statistics. 2000. V. 28. No. 3. P. 860–870.
Dahlhaus R. Maximum likelihood estimation and model selection for locally stationary processes // J. Nonparametr. Statist. 1996. V. 6. No. 2–3. P. 171–191.
Helland I.S. Central limit theorems for martingales with discrete or continuous time // Scand. J. Statist. 1982. V. 9. No. 2. P. 79–94.
Rebolledo R. Central limit theorems for local martingales // Z. Wahrsch. Verw. Gebiete. 1980. V. 51. No. 3. P. 269–286.
Dahlhaus R. On the Kullback-Leibler information divergence of locally stationary processes // Stochastic Process. Appl. 1996. V. 62. No. 1. P. 139–168.
Galtchouk L., Pergamenchtchikov S. Asymptotically efficient estimates for nonparametric regression models // Statist. Probab. Lett. 2006. V. 76. No. 8. P. 852–860.