Search results for: minimax estimation

Number of results: 268563

2015
XuanLong Nguyen

We establish maximum likelihood and minimax optimal rates of parameter estimation for mean-covariance multivariate Gaussian mixtures, shape-rate Gamma mixtures, and some variants of finite mixture models, including the setting where the number of mixing components is bounded but unknown. These models belong to what we call "weakly identifiable" classes, which exhibit specific interactions among ...
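
For orientation only (this formulation is not quoted from the abstract), minimax rates for mixture models are often stated for the mixing measure itself under a Wasserstein loss; the class G_k, the loss W_1, and the notation below are illustrative assumptions:

```latex
% Minimax risk for estimating a mixing measure G_0 from n i.i.d. draws of the mixture
% density p_{G_0}; \mathcal{G}_k is a class of mixing measures with at most k atoms and
% W_1 is the first-order Wasserstein distance (an illustrative choice of loss).
\[
\inf_{\widehat{G}_n}\;\sup_{G_0 \in \mathcal{G}_k}\;
  \mathbb{E}_{p_{G_0}}\!\left[\, W_1\!\bigl(\widehat{G}_n,\, G_0\bigr) \right]
\]
```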

Journal: J. Multivariate Analysis, 2014
Michael Kohler

Given the values of a measurable function m : R^d → R at n arbitrarily chosen points in R^d, the problem of estimating m on the whole of R^d, such that the L1 error (with integration with respect to a fixed but unknown probability measure) of the estimate is small, is considered. Under the assumption that m is (p, C)-smooth (i.e., roughly speaking, m is p-times continuously differentiable), it is shown tha...
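
For readers unfamiliar with the smoothness class mentioned above, the following is the usual textbook definition of (p, C)-smoothness; the class used in the paper may differ in details:

```latex
% (p, C)-smoothness: write p = q + s with q a non-negative integer and s in (0, 1].
% A function m : R^d -> R is (p, C)-smooth if all partial derivatives of order q exist
% and each of them is Hoelder continuous with exponent s and constant C:
\[
\bigl| \partial^{\alpha} m(x) - \partial^{\alpha} m(z) \bigr|
  \;\le\; C\, \| x - z \|^{\,s}
\qquad \text{for all } x, z \in \mathbb{R}^d \text{ and all multi-indices } |\alpha| = q .
\]
```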

2006
Lawrence D. Brown, Michael Levine

Consider a Gaussian nonparametric regression problem having both an unknown mean function and unknown variance function. This article presents a class of difference-based kernel estimators for the variance function. Optimal convergence rates that are uniform over broad functional classes and bandwidths are fully characterized, and asymptotic normality is also established. We also show that for ...
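
A minimal numerical sketch of the general idea behind difference-based variance estimation (first-order squared differences followed by kernel smoothing); the function names, the Gaussian kernel, and the bandwidth choice are illustrative assumptions, not the estimator class analyzed in the paper:

```python
import numpy as np

def difference_based_variance(x, y, grid, bandwidth=0.1):
    """Sketch of a first-order difference-based kernel estimator of the variance
    function in the model y_i = m(x_i) + sigma(x_i) * eps_i.

    Assumes the design points x are sorted. Squared half-differences
    D_i = (y_{i+1} - y_i)^2 / 2 are roughly unbiased for the local variance when
    the mean function is smooth; they are then smoothed with a Nadaraya-Watson
    kernel average on `grid`.
    """
    d = 0.5 * (y[1:] - y[:-1]) ** 2      # pseudo-observations of sigma^2
    t = 0.5 * (x[1:] + x[:-1])           # midpoints where they are attached
    out = np.empty_like(grid, dtype=float)
    for j, g in enumerate(grid):
        w = np.exp(-0.5 * ((t - g) / bandwidth) ** 2)  # Gaussian kernel weights
        out[j] = np.sum(w * d) / np.sum(w)
    return out

# Tiny usage example with a known variance function sigma^2(x) = (0.5 + x)^2.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 2000))
y = np.sin(2 * np.pi * x) + (0.5 + x) * rng.standard_normal(x.size)
grid = np.linspace(0.05, 0.95, 5)
print(difference_based_variance(x, y, grid))  # rough estimates of (0.5 + x)^2 on grid
```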

2005
Tong Zhang

We derive upper and lower bounds for some statistical estimation problems. The upper bounds are established for the Gibbs algorithm. The lower bounds, applicable to all statistical estimators, match the obtained upper bounds for various problems. Moreover, our framework can be regarded as a natural generalization of the standard minimax framework, in that we allow the performance of the estima...
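
For context, the "Gibbs algorithm" in this line of work usually refers to drawing the estimator from (or averaging under) a Gibbs posterior of the following form; the notation (prior π, loss ℓ, temperature λ) is illustrative rather than taken from the paper:

```latex
% Gibbs posterior over parameters theta, given data X_1, ..., X_n, a prior pi,
% a loss function ell, and an inverse temperature lambda > 0:
\[
\widehat{\pi}_n(d\theta) \;\propto\;
  \exp\!\Bigl( -\lambda \sum_{i=1}^{n} \ell(\theta, X_i) \Bigr)\, \pi(d\theta)
\]
```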

2016
James M. Robins Lingling Li Rajarshi Mukherjee Eric Tchetgen Tchetgen Aad van der Vaart

We introduce a new method of estimation of parameters in semiparametric and nonparametric models. The method is based on U-statistics constructed from higher-order influence functions, which extend ordinary linear influence functions of the parameter of interest and represent higher derivatives of this parameter. For parameters for which the representation cannot be perfect, the method often leads ...
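
As background (and not the paper's own construction), a first-order influence function enters through the usual von Mises expansion of a functional ψ; the higher-order approach alluded to above adds higher-order U-statistic correction terms. A hedged first-order version, in generic notation:

```latex
% First-order von Mises expansion of a functional psi around P, evaluated at Q:
% IF_1 is the first-order influence function (mean zero under P) and R_2 is the
% second-order remainder that higher-order corrections aim to reduce.
\[
\psi(Q) \;=\; \psi(P) \;+\; \int \mathrm{IF}_1(x; P)\, d\bigl(Q - P\bigr)(x) \;+\; R_2(Q, P)
\]
```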

2008
Aarti Singh Robert D. Nowak Clayton D. Scott

Hausdorff-accurate estimation of density level sets is relevant in applications where a spatially uniform mode of convergence is desired to ensure that the estimated set is close to the target set at all points. The minimax optimal rate of error convergence for the Hausdorff metric is known to be (n/log n)^(-1/(d+2α)) for level sets with Lipschitz boundaries, where the parameter α characterizes the regular...
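
For reference, the two standard objects the abstract refers to are the Hausdorff distance between sets and the plug-in level set estimator; the notation (density f, level γ, estimate f̂_n) is illustrative:

```latex
% Hausdorff distance between nonempty sets G_1, G_2, where d(x, G) = inf_{y in G} ||x - y||:
\[
d_H(G_1, G_2) \;=\; \max\Bigl\{ \sup_{x \in G_1} d(x, G_2), \;\; \sup_{x \in G_2} d(x, G_1) \Bigr\}
\]
% A plug-in estimator of the gamma-level set of a density f, built from a density estimate:
\[
\widehat{G}_n \;=\; \bigl\{\, x : \widehat{f}_n(x) \ge \gamma \,\bigr\}
\]
```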

Journal: Journal of Machine Learning Research, 2016
Xi Chen Adityanand Guntuboyina Yuchen Zhang

This paper provides a general technique for lower bounding the Bayes risk of statistical estimation, applicable to arbitrary loss functions and arbitrary prior distributions. A lower bound on the Bayes risk not only serves as a lower bound on the minimax risk, but also characterizes the fundamental limit of any estimator given the prior knowledge. Our bounds are based on the notion of f-inform...
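
A quantity that commonly appears in such Bayes-risk bounds is Csiszár's f-informativity of a prior π with respect to a family of distributions {P_θ}. Its standard definition is reproduced below in illustrative notation; the paper's actual bounds are not restated here:

```latex
% f-informativity of a prior pi with respect to a family {P_theta}, where D_f is an
% f-divergence; the infimum runs over all probability measures Q on the sample space.
\[
I_f\bigl(\pi; \{P_\theta\}\bigr) \;=\; \inf_{Q} \int D_f\!\bigl(P_\theta \,\|\, Q\bigr)\, d\pi(\theta)
\]
```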

ABSTRACT. Let R be a commutative Noetherian ring and let I and J be two ideals of R. In this paper we introduce the concept of an (I, J)-minimax R-module, and it is shown that if M is an (I, J)-minimax R-module and t is a non-negative integer such that H^i_{I,J}(M) is (I, J)-minimax for all i ...
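
For readers outside commutative algebra: in this entry "minimax" has an algebraic meaning unrelated to statistical minimaxity. The standard definition of a minimax module, stated here for reference rather than quoted from the abstract, is:

```latex
% An R-module M is called minimax if it admits a finitely generated submodule N
% such that the quotient M/N is Artinian:
\[
\exists\, N \subseteq M \ \text{finitely generated such that } M/N \ \text{is Artinian.}
\]
```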

Journal: Operations Research, 2022

In “Enhanced Balancing of Bias-Variance Tradeoff in Stochastic Estimation: A Minimax Perspective”, the authors study a framework to construct new classes of stochastic estimators that can consistently beat existing benchmarks regardless of key model parameter values. Oftentimes biased estimators, such as finite-difference black-box gradient estimators, require the selection of tuning parameters to balance bias...
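
A minimal sketch of the bias-variance tradeoff alluded to above, for a central finite-difference gradient estimator of a noisy black-box function; the function names and the step sizes tried are illustrative assumptions, not the authors' construction:

```python
import numpy as np

def central_fd_gradient(f_noisy, x, h, n_samples, rng):
    """Central finite-difference estimate of f'(x) from noisy evaluations.

    Bias is O(h^2) from the Taylor remainder, while the variance of the averaged
    estimate scales like sigma^2 / (n_samples * h^2), so the step size h trades
    one error source off against the other.
    """
    draws = [(f_noisy(x + h, rng) - f_noisy(x - h, rng)) / (2.0 * h)
             for _ in range(n_samples)]
    return float(np.mean(draws))

# Toy black box: f(x) = sin(x) observed with additive Gaussian noise.
def f_noisy(x, rng, sigma=0.1):
    return np.sin(x) + sigma * rng.standard_normal()

rng = np.random.default_rng(1)
x0, n = 1.0, 200
for h in (1.0, 0.3, 0.1, 0.03, 0.01):
    est = central_fd_gradient(f_noisy, x0, h, n, rng)
    print(f"h={h:5.2f}  estimate={est: .4f}  true={np.cos(x0): .4f}")
# Too large h -> bias dominates; too small h -> noise is amplified by the 1/h factor.
```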

[Chart: number of search results per year]