Similar resources
Markov Decision Problems Where Means Bound Variances
We identify a rich class of finite-horizon Markov decision problems (MDPs) for which the variance of the optimal total reward can be bounded by a simple affine function of its expected value. The class is characterized by three natural properties: reward boundedness, existence of a do-nothing action, and optimal action monotonicity. These properties are commonly present and typically easy to ch...
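For orientation, the headline claim can be written out as follows (a sketch on my part; the constants a and b below are placeholders, not taken from the paper):

```latex
% Sketch of an affine mean-variance bound; a, b are unspecified constants
\operatorname{Var}\!\bigl[R^{*}\bigr] \;\le\; a + b\,\mathbb{E}\!\bigl[R^{*}\bigr]
```

where R* denotes the optimal total reward of the finite-horizon MDP.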
Asymptotic Bound on Binary Self-Orthogonal Codes
We present two constructions for binary self-orthogonal codes. It turns out that our constructions yield a constructive bound on binary self-orthogonal codes. In particular, when the information rate is R = 1/2, our constructive lower bound gives relative minimum distance δ ≈ 0.0595 (the GV bound gives δ ≈ 0.110). Moreover, we have proved that binary self-orthogonal codes asymptotically achieve the...
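As a quick sanity check on the quoted Gilbert–Varshamov figure (my own sketch, not code from the paper), one can solve the asymptotic GV relation R = 1 − H₂(δ) at R = 1/2 by bisection:

```python
# Sketch: verify that the asymptotic GV bound R = 1 - H2(delta)
# gives delta ~ 0.110 at rate R = 1/2, as quoted in the abstract.
import math

def h2(p: float) -> float:
    """Binary entropy function H2(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def gv_delta(rate: float, lo: float = 1e-12, hi: float = 0.5) -> float:
    """Solve 1 - H2(delta) = rate for delta in (0, 1/2) by bisection."""
    for _ in range(100):
        mid = (lo + hi) / 2
        if 1 - h2(mid) > rate:
            lo = mid  # rate still exceeded, delta can grow further
        else:
            hi = mid
    return (lo + hi) / 2

print(f"GV relative distance at R = 1/2: {gv_delta(0.5):.4f}")  # ~0.1100
```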
Asymptotic approximation of nonparametric regression experiments with unknown variances
Asymptotic equivalence results for nonparametric regression experiments have always assumed that the variances of the observations are known. In practice, however, the variance of each observation is generally considered to be an unknown nuisance parameter. We establish an asymptotic approximation to the nonparametric regression experiment when the value of the variance is an additional paramete...
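For reference, the experiment in question is typically of the form below (my own sketch of the standard setup, not quoted from the paper):

```latex
% Sketch of the usual Gaussian regression experiment with unknown variance
Y_i = f(t_i) + \sigma\,\varepsilon_i, \qquad \varepsilon_i \overset{iid}{\sim} N(0,1), \quad i = 1,\dots,n
```

with f an unknown regression function and σ treated as an unknown nuisance parameter rather than a known constant.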
Asymptotic variances of QTL estimators with selective DNA pooling.
Investigation of QTL-marker linkage usually requires a large number of observed recombinations, inferred from combined analysis of phenotypes and genotypes. To avoid costly individual genotyping, inferences on QTL position and effects can instead make use of marker allele frequencies. DNA pooling of selected samples makes allele frequency estimation feasible for studies involving large sample s...
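As a minimal illustration of the quantities involved (my own sketch; the paper's actual variance expressions also account for selection and technical measurement error), the sampling contribution to the variance of a pooled allele-frequency estimate is binomial:

```python
# Sketch: binomial sampling variance of an allele-frequency estimate
# from a DNA pool of n diploid individuals (2n sampled chromosomes).
def pool_freq_variance(p: float, n_individuals: int) -> float:
    """Var(p_hat) = p(1-p) / (2n) under simple binomial sampling."""
    chromosomes = 2 * n_individuals
    return p * (1 - p) / chromosomes

# e.g. a pool of 100 individuals, true allele frequency 0.3:
print(pool_freq_variance(0.3, 100))  # 0.00105
```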
On the asymptotic tightness of the Shannon lower bound
New results are proved on the convergence of the Shannon lower bound to the rate distortion function as the distortion decreases to zero. The key convergence result is proved using a fundamental property of informational divergence. As a corollary, it is shown that the Shannon lower bound is asymptotically tight for norm-based distortions, when the source vector has a finite differential entrop...
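For reference, under squared-error distortion the Shannon lower bound for a scalar source X with finite differential entropy h(X) takes the standard form below (stated here as a sketch rather than quoted from the paper):

```latex
% Standard Shannon lower bound under squared-error distortion (scalar source)
R(D) \;\ge\; R_{\mathrm{SLB}}(D) \;=\; h(X) - \tfrac{1}{2}\log\bigl(2\pi e D\bigr)
```

Asymptotic tightness then means that R(D) − R_SLB(D) → 0 as D ↓ 0.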
Journal
Journal title: The Annals of Mathematical Statistics
Year: 1964
ISSN: 0003-4851
DOI: 10.1214/aoms/1177700378