Minimax estimation of the Gaussian parametric regression | Vestnik Tomskogo gosudarstvennogo universiteta. Matematika i mekhanika – Tomsk State University Journal of Mathematics and Mechanics. 2014. № 5(31).

Minimax estimation of the Gaussian parametric regression

The paper considers the problem of estimating the mean vector of a multivariate normal distribution of dimension $d > 2$ under quadratic loss. Let the observations be described by the equation $Y = \theta + \sigma\xi$ (1), where $\theta$ is a $d$-dimensional vector of unknown parameters from some bounded set $\Theta \subset \mathbb{R}^d$, $\xi$ is a Gaussian random vector with zero mean and identity covariance matrix $I_d$, i.e. $\mathrm{Law}(\xi) = \mathcal{N}_d(0, I_d)$, and $\sigma$ is a known positive number. The problem is to construct a minimax estimator of the vector $\theta$ from the observations $Y$. As a measure of the accuracy of an estimator $\hat\theta$ we take the quadratic risk $R(\theta, \hat\theta) := \mathbf{E}_\theta\,|\hat\theta - \theta|^2$, where $|x|^2 = \sum_{j=1}^{d} x_j^2$ and $\mathbf{E}_\theta$ is the expectation with respect to the measure $\mathbf{P}_\theta$. We propose a modification of the James-Stein procedure of the form $\hat\theta^{+} = \left(1 - \frac{c}{|Y|}\right)_{+} Y$, where $c > 0$ is a special constant and $a_{+} = \max(a, 0)$ denotes the positive part of $a$. This estimator admits an explicit upper bound on its quadratic risk and has a significantly smaller risk than the usual maximum likelihood estimator and the estimator $\hat\theta^{*} = \left(1 - \frac{c}{|Y|}\right) Y$ for dimensions $d > 2$. We establish that the proposed procedure $\hat\theta^{+}$ is a minimax estimator of the vector $\theta$. A numerical comparison of the quadratic risks of the considered procedures is given. In conclusion, it is shown that the proposed minimax estimator $\hat\theta^{+}$ is the best of the considered estimators in the mean square sense.
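To make the comparison of quadratic risks concrete, a minimal Monte Carlo sketch is given below. It is not the paper's numerical experiment: the parameter point theta, the noise level sigma, the shrinkage constant c, the dimension d, and the number of replications are illustrative placeholders, and the function names are introduced here only for the sketch.

```python
# Illustrative Monte Carlo comparison of quadratic risks for the model Y = theta + sigma*xi
# with Law(xi) = N_d(0, I_d). The estimators follow the forms given in the abstract;
# the values of theta, sigma, c and the number of replications are placeholders.
import numpy as np

rng = np.random.default_rng(0)

def empirical_risk(estimator, theta, sigma=1.0, n_mc=100_000):
    """Monte Carlo estimate of the quadratic risk E_theta |hat(theta) - theta|^2."""
    d = theta.size
    Y = theta + sigma * rng.standard_normal((n_mc, d))  # n_mc independent draws of Y
    return np.mean(np.sum((estimator(Y, sigma) - theta) ** 2, axis=1))

def mle(Y, sigma):
    # Maximum likelihood estimator: hat(theta) = Y
    return Y

def james_stein(Y, sigma, c=1.0):
    # Shrinkage estimator theta* = (1 - c/|Y|) Y; c is an illustrative constant,
    # not the specific value derived in the paper.
    norm = np.linalg.norm(Y, axis=1, keepdims=True)
    return (1.0 - c / norm) * Y

def james_stein_plus(Y, sigma, c=1.0):
    # Positive-part modification theta+ = (1 - c/|Y|)_+ Y
    norm = np.linalg.norm(Y, axis=1, keepdims=True)
    return np.maximum(1.0 - c / norm, 0.0) * Y

if __name__ == "__main__":
    d = 5
    theta = np.full(d, 0.3)  # a point in a bounded parameter set (illustrative)
    for name, est in [("MLE", mle), ("shrinkage", james_stein), ("positive part", james_stein_plus)]:
        print(f"{name:14s} empirical risk: {empirical_risk(est, theta):.4f}")
```

In this setup the positive-part factor never reverses the sign of Y when |Y| < c, which is the mechanism behind its smaller risk relative to the plain shrinkage form.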

Keywords

parametric regression, improved estimation, James-Stein procedure, mean squared risk, minimax estimator

Authors

Name | Organization | E-mail
Pchelintsev Valery Anatolyevich | Tomsk Polytechnic University | vpchelintsev@vtomske.ru
Pchelintsev Evgeny Anatolyevich | Tomsk State University | evgen-pch@yandex.ru

References

Ibragimov I.A., Khasminskii R.Z. Asymptotic Theory of Estimation. Moscow: Nauka, 1979. (In Russian)
Konev V.V., Pergamenshchikov S.M., Pchelintsev E.A. Estimation of regression with impulse-type noise from discrete observations // Teor. Veroyatnost. i Primenen. 2013. V. 58(3). P. 454-471. (In Russian)
Konev V.V., Pchelintsev E.A. Estimation of parametric regression with impulse noise from discrete observations // Tomsk State University Journal of Mathematics and Mechanics. 2012. № 1(17). P. 20-35. (In Russian)
Pchelintsev E. Improved estimation in a non-Gaussian parametric regression // Statistical Inference for Stochastic Processes. 2013. V. 16 (1). P. 15-28.
Fourdrinier D., Strawderman W.E. A unified and generalized set of shrinkage bounds on minimax Stein estimates // J. Multivariate Anal. 2008. V. 99. P. 2221-2233.
Pchelintsev E.A. The James-Stein procedure for conditionally Gaussian regression // Tomsk State University Journal of Mathematics and Mechanics. 2011. № 4(16). P. 6-17. (In Russian)
Gleser L.J. Minimax estimators of a normal mean vector for arbitrary quadratic loss and unknown covariance matrix // The Annals of Statistics. 1986. V. 14. P. 1625-1633.
Berger J.O., Haff L.R. A class of minimax estimators of a normal mean vector for arbitrary quadratic loss and unknown covariance matrix // Statist. Decisions. 1983. No. 1. P. 105-129.
Fourdrinier D., Pergamenshchikov S. Improved model selection method for the regression with dependent noise // Ann. Inst. Statist. Math. 2007. V. 59(3). P. 435-464.
Shao P.Y.-S., Strawderman W.E. Improving on the James-Stein positive-part estimator // The Annals of Statistics. 1994. V. 22. P. 1517-1538.
Efron B., Morris C. Families of minimax estimators of the mean of a multivariate normal distribution // The Annals of Statistics. 1976. V. 4. P. 11-21.
Guo Y.Y., Pal N. A sequence of improvements over the James-Stein estimator // J. Multivariate Analysis. 1992. V. 42. P. 302-317.
Stein C. Estimation of the mean of a multivariate normal distribution // The Annals of Statistics. 1981. V. 9(6). P. 1135-1151.
Baranchik A.J. Multiple regression and estimation of the mean of a multivariate normal distribution // Technical Report No. 51. Stanford: Department of Statistics, Stanford University, 1964.
Strawderman W.E. Proper Bayes minimax estimators of the multivariate normal mean // The Annals of Mathematical Statistics. 1971. V. 42. P. 385-388.
Fourdrinier D. Statistique Inferentielle. Paris: Dunod, 2002.
Lehmann E.L., Casella G. Theory of Point Estimation. 2nd edition. N.Y.: Springer, 1998.
James W., Stein C. Estimation with quadratic loss // Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability. V. 1. Berkeley: University of California Press, 1961. P. 361-380.