Estimator selection with respect to Hellinger-type risks

Author

  • YANNICK BARAUD
Abstract

We observe a random measure N and aim at estimating its intensity s. This statistical framework makes it possible to deal simultaneously with the problems of estimating a density, the marginals of a multivariate distribution, the mean of a random vector with nonnegative components, and the intensity of a Poisson process. Our estimation strategy is based on estimator selection: given a family of estimators of s based on the observation of N, we propose a selection rule, based on N as well, in order to select among them. Few assumptions are made on the collection of estimators. The procedure offers the possibility to perform model selection and also to select among estimators associated with different model selection strategies. In addition, it provides an alternative to the T-estimators studied recently by Birgé (2006). For illustration, we consider the problems of estimation and (complete) variable selection in various regression settings.
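For reference, a standard form of the squared Hellinger distance between two intensities s and t with respect to a reference measure μ on X (a common definition; the exact Hellinger-type distance used in the paper may differ in normalization) is

h^2(s,t) = \frac{1}{2}\int_X \left(\sqrt{s(x)} - \sqrt{t(x)}\right)^2 \, d\mu(x),

which reduces to the usual Hellinger distance between probability densities when s and t integrate to one.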


Related articles

Model selection for Gaussian regression with random design

This paper is about Gaussian regression with random design, where the observations are i.i.d. It is known from Le Cam (1973, 1975 and 1986) that the rate of convergence of optimal estimators is closely connected to the metric structure of the parameter space with respect to the Hellinger distance. In particular, this metric structure essentially determines the risk when the loss function is a ...


Estimating the Intensity of a Random Measure by Histogram Type Estimators

The purpose of this paper is to estimate the intensity of some random measure N on a set X by a piecewise constant function on a finite partition of X. Given a (possibly large) family M of candidate partitions, we build a piecewise constant estimator (histogram) on each of them and then use the data to select one estimator in the family. Choosing the square of a Hellinger-type distance as our ...
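As a rough illustration of the idea only (a minimal sketch: the candidate partitions, the hold-out split and the selection rule below are assumptions made for this example, not the penalized criterion of the paper), the following Python code builds a histogram density estimate on each candidate partition and keeps the one whose squared Hellinger distance to a hold-out reference histogram is smallest.

import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=500)                          # toy sample (hypothetical)
train, valid = data[:250], data[250:]

lo, hi = data.min(), data.max()
candidates = [np.linspace(lo, hi, k + 1) for k in (5, 10, 20, 40)]   # regular partitions

def histogram_density(sample, edges):
    # Piecewise-constant (histogram) density estimate on the partition `edges`.
    counts, _ = np.histogram(sample, bins=edges)
    return counts / (len(sample) * np.diff(edges))

def evaluate(heights, edges, x):
    # Value of the piecewise-constant density at the points x (zero outside the range).
    idx = np.searchsorted(edges, x, side="right") - 1
    out = np.zeros_like(x, dtype=float)
    inside = (idx >= 0) & (idx < len(heights))
    out[inside] = heights[idx[inside]]
    return out

# Fine common grid on which every candidate is compared to a hold-out reference.
grid = np.linspace(lo, hi, 1001)
mid = 0.5 * (grid[:-1] + grid[1:])
dx = np.diff(grid)
reference = histogram_density(valid, grid)

def hellinger_sq(f, g):
    # Squared Hellinger distance between densities tabulated on the common grid.
    return 0.5 * np.sum((np.sqrt(f) - np.sqrt(g)) ** 2 * dx)

risks = [hellinger_sq(evaluate(histogram_density(train, e), e, mid), reference)
         for e in candidates]
selected = candidates[int(np.argmin(risks))]
print("selected number of bins:", len(selected) - 1)

The hold-out comparison above is only a stand-in for a data-driven selection criterion and is not the rule proposed in the paper.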


Statistical Inference in Sparse High-dimensional Models: Theoretical and computational challenges

In density estimation, we show that our procedure allows us to recover (at least when the number of observations is large enough) the celebrated maximum likelihood estimator when the model is regular enough and contains the true density. When the latter condition is not satisfied, we show that our procedure is robust (with respect to the Hellinger distance), while the maximum likelihood estimator ...


Global Rates of Convergence in Log-concave Density Estimation by Arlene

The estimation of a log-concave density on R^d represents a central problem in the area of nonparametric inference under shape constraints. In this paper, we study the performance of log-concave density estimators with respect to global loss functions, and adopt a minimax approach. We first show that no statistical procedure based on a sample of size n can estimate a log-concave density with res...


Minimum Hellinger Distance Estimation with Inlier Modification

Inference procedures based on the Hellinger distance provide attractive alternatives to likelihood-based methods for the statistician. The minimum Hellinger distance estimator has full asymptotic efficiency under the model together with strong robustness properties under model misspecification. However, the Hellinger distance puts too large a weight on the inliers, which appears to be the main r...
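As a toy illustration of minimum Hellinger distance estimation (a sketch under assumptions: the contaminated normal sample, the unit variance and the default kernel bandwidth are choices made for this example, not taken from the paper), the code below fits a normal location parameter by minimizing the squared Hellinger distance between a kernel density estimate and the model density; the resulting estimate is barely affected by the outliers, unlike the sample mean.

import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import gaussian_kde, norm

rng = np.random.default_rng(1)
# Normal sample centred at 2 with a handful of gross outliers (hypothetical data).
sample = np.concatenate([rng.normal(loc=2.0, size=190), rng.normal(loc=15.0, size=10)])

kde = gaussian_kde(sample)                      # nonparametric density estimate
grid = np.linspace(sample.min() - 1.0, sample.max() + 1.0, 2000)
dx = grid[1] - grid[0]
f_hat = kde(grid)

def hellinger_sq(mu):
    # Squared Hellinger distance between the KDE and the N(mu, 1) density on the grid.
    g = norm.pdf(grid, loc=mu, scale=1.0)
    return 0.5 * np.sum((np.sqrt(f_hat) - np.sqrt(g)) ** 2) * dx

result = minimize_scalar(hellinger_sq, bounds=(-10.0, 20.0), method="bounded")
print("minimum Hellinger distance estimate:", result.x)       # close to 2
print("sample mean:                        ", sample.mean())  # pulled towards the outliers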




Publication year: 2009