The James–Stein procedure for a conditionally Gaussian regression
The paper considers the problem of estimating the p-dimensional (p ≥ 2) mean vector of a multivariate conditionally normal distribution under quadratic loss. Problems of this type arise when estimating the parameters in a continuous time regression model with a non-Gaussian Ornstein-Uhlenbeck process. We propose a modification of the James-Stein procedure of the form θ*(Y) = (1 − c/||Y||) Y, where Y is an observation and c > 0 is a special constant. This estimate allows one to derive an explicit upper bound for the quadratic risk and has a significantly smaller risk than the usual maximum likelihood estimator for dimensions p ≥ 2. The procedure is applied to the problem of parametric estimation in a continuous time conditionally Gaussian regression model and to that of estimating the mean vector of a multivariate normal distribution when the covariance matrix is unknown and depends on nuisance parameters.
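Since the abstract describes the shrinkage rule θ*(Y) = (1 − c/||Y||) Y and its risk advantage over the maximum likelihood estimator, a small Monte Carlo sketch may help illustrate the comparison. It is not the paper's procedure: it assumes the idealized model Y ~ N(θ, I_p) with known identity covariance and uses the purely illustrative constant c = p − 2; the paper derives its own constant c > 0 and an explicit risk bound for the conditionally Gaussian case.

```python
import numpy as np

rng = np.random.default_rng(0)

def shrinkage_estimate(y, c):
    # theta*(Y) = (1 - c/||Y||) Y, the form given in the abstract
    return (1.0 - c / np.linalg.norm(y)) * y

def empirical_risks(theta, c, n_rep=200_000):
    # Monte Carlo estimates of the quadratic risk E||theta_hat - theta||^2
    # for the MLE theta_ML(Y) = Y and for the shrinkage rule above,
    # under the simplified model Y ~ N(theta, I_p).
    p = theta.size
    y = theta + rng.standard_normal((n_rep, p))
    risk_mle = np.mean(np.sum((y - theta) ** 2, axis=1))        # close to p
    norms = np.linalg.norm(y, axis=1, keepdims=True)
    shrunk = (1.0 - c / norms) * y
    risk_shrink = np.mean(np.sum((shrunk - theta) ** 2, axis=1))
    return risk_mle, risk_shrink

p = 5
theta = np.zeros(p)               # the gain is largest near the origin
c = p - 2                         # illustrative constant, NOT the paper's c
print(empirical_risks(theta, c))  # roughly (5.0, 1.2) for this setup
```

For θ near the origin the shrinkage risk is well below the MLE risk p; the uniform improvement claimed in the paper relies on its specific choice of the constant c for the conditionally Gaussian model.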
Keywords
non-Gaussian Ornstein-Uhlenbeck process,
James-Stein procedure,
improved estimation,
conditionally Gaussian regression model,
Authors
Pchelintsev Evgenii Anatolevich | National Research Tomsk State University, Universite de Rouen (France) | evgen-pch@yandex.ru
The James–Stein procedure for a conditionally Gaussian regression | Vestnik Tomskogo gosudarstvennogo universiteta. Matematika i mekhanika – Tomsk State University Journal of Mathematics and Mechanics. 2011. № 4(16).