Admissibility and minimaxity of generalized Bayes estimators for spherically symmetric family
Abstract
We give a sufficient condition for the admissibility of generalized Bayes estimators of the location vector of a spherically symmetric distribution under squared error loss. Compared with the known results for the multivariate normal case, our sufficient condition is very tight and close to being necessary. In particular, we establish the admissibility of generalized Bayes estimators with respect to the harmonic prior and to priors with slightly heavier tails than the harmonic prior. We use the theory of regularly varying functions to construct a sequence of smooth proper priors approaching the improper prior fast enough to establish admissibility. We also discuss conditions for minimaxity of the generalized Bayes estimator with respect to the harmonic prior.
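For reference, the objects named in the abstract can be written out in the standard way. The following is only a sketch of the usual formulation, assuming a d-dimensional observation X with spherically symmetric density f(‖x−θ‖²); the precise regularity conditions are those of the paper itself.

\delta_\pi(x) \;=\; \frac{\int_{\mathbb{R}^d} \theta\, f(\|x-\theta\|^2)\,\pi(\theta)\,d\theta}{\int_{\mathbb{R}^d} f(\|x-\theta\|^2)\,\pi(\theta)\,d\theta}
\qquad \text{(generalized Bayes estimator under squared error loss, i.e. the posterior mean),}

\pi_h(\theta) \;=\; \|\theta\|^{2-d}, \quad d \ge 3
\qquad \text{(harmonic prior: } \Delta\|\theta\|^{2-d} = 0 \text{ for } \theta \ne 0\text{).}

Admissibility proofs of this kind proceed in the spirit of Blyth's method: a sequence of proper priors is exhibited whose Bayes risks approach that of \delta_\pi fast enough, and the regularly varying construction mentioned above supplies such a sequence.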
Similar articles
An admissibility proof using an adaptive sequence of smoother proper priors approaching the target improper prior
We give a sufficient condition for the admissibility of generalized Bayes estimators of the location vector of a spherically symmetric distribution under squared error loss. Compared with the known results for the multivariate normal case, our sufficient condition is very tight and close to being necessary. In particular, we establish the admissibility of generalized Bayes estimators with respect...
A unified approach to non-minimaxity of sets of linear combinations of restricted location estimators
This paper studies minimaxity of estimators of a set of linear combinations of location parameters µ_i, i = 1, ..., k, under q...
A new class of generalized Bayes minimax ridge regression estimators
Let y = Aβ + ε, where y is an N × 1 vector of observations, β is a p × 1 vector of unknown regression coefficients, A is an N × p design matrix and ε is a spherically symmetric error term with unknown scale parameter σ. We consider estimation of β under general quadratic loss functions, and, in particular, extend the work of Strawderman [J. Amer. Statist. Assoc. 73 (1978) 623–627] and Casella [A...
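For orientation only (the excerpt does not reproduce the paper's own class of estimators), the least squares and ordinary ridge estimators in this model take the familiar forms, with k ≥ 0 a generic ridge constant:

\hat\beta_{\mathrm{LS}} = (A^\top A)^{-1} A^\top y, \qquad \hat\beta_k = (A^\top A + k I_p)^{-1} A^\top y.

Generalized Bayes ridge-type estimators replace the fixed constant k by data-dependent shrinkage derived from a (possibly improper) prior on β; minimaxity in this setting is usually verified by showing that the estimator's risk never exceeds that of the least squares estimator, which is itself minimax under quadratic loss.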
An extended class of minimax generalized Bayes estimators of regression coefficients
We derive minimax generalized Bayes estimators of regression coefficients in the general linear model with spherically symmetric errors under invariant quadratic loss for the case of unknown scale. The class of estimators generalizes the class considered in Maruyama and Strawderman (2005) to include non-monotone shrinkage functions. AMS subject classification: Primary 62C20, secondary 62J07
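A schematic form of such shrinkage estimators, given here as an assumed illustration since the excerpt does not reproduce the exact class: with x the canonical coordinates of the regression coefficient estimate and s an independent residual sum of squares estimating the unknown scale,

\delta_\phi(x, s) \;=\; \left(1 - \frac{\phi(\|x\|^2/s)}{\|x\|^2/s}\right) x,

where φ is a shrinkage function. A constant φ gives the James–Stein-type (monotone) case; the extension described above is to classes admitting non-monotone φ while retaining minimaxity.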
On the Bayesness, minimaxity and admissibility of point estimators of allelic frequencies.
In this paper, decision theory was used to derive Bayes and minimax decision rules for estimating allelic frequencies and to explore their admissibility. Decision rules with uniformly smallest risk usually do not exist, and one approach to this problem is to use the Bayes principle and the minimax principle to find decision rules satisfying some general optimality criterion based on their ris...
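A minimal worked instance of the two principles for an allelic frequency, under the simple assumption of binomial sampling (x copies of the allele among n sampled gene copies, frequency p, squared error loss), not necessarily the exact genetic model of the paper:

\hat p_{\mathrm{Bayes}} = \frac{x+\alpha}{n+\alpha+\beta} \quad \text{under a Beta}(\alpha,\beta)\text{ prior},
\qquad
\hat p_{\mathrm{minimax}} = \frac{x+\sqrt{n}/2}{n+\sqrt{n}},

the latter being the Bayes rule under the Beta(√n/2, √n/2) prior; it has constant risk, which combined with its Bayes property yields minimaxity.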