Chernoff information of exponential families

Author

  • Frank Nielsen
Abstract

Chernoff information upper bounds the probability of error of the optimal Bayesian decision rule for 2-class classification problems. However, it turns out that in practice the Chernoff bound is hard to calculate or even approximate. In statistics, many usual distributions, such as Gaussians, Poissons or frequency histograms called multinomials, can be handled in the unified framework of exponential families. In this note, we prove that the Chernoff information for members of the same exponential family can be either derived analytically in closed form, or efficiently approximated using a simple geodesic bisection optimization technique based on an exact geometric characterization of the “Chernoff point” on the underlying statistical manifold.
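As a concrete illustration of the bisection technique sketched above, here is a minimal Python sketch (my own, not the paper's reference code) for univariate Gaussians written as an exponential family with log-normalizer F: the Chernoff point θ_α on the natural-parameter segment [θ₁, θ₂] is located by bisecting on the sign of B_F(θ₁ : θ_α) − B_F(θ₂ : θ_α), and the common value at the optimum is the Chernoff information. All function names are illustrative.

```python
# A minimal sketch (not the paper's reference code) of geodesic bisection for
# the Chernoff information of two univariate Gaussians viewed as members of
# the same exponential family.
import numpy as np

def log_normalizer(theta):
    """F(theta) for the univariate Gaussian family, theta = (mu/s^2, -1/(2 s^2))."""
    t1, t2 = theta
    return -t1 * t1 / (4.0 * t2) + 0.5 * np.log(-np.pi / t2)

def grad_log_normalizer(theta):
    """grad F(theta) = (E[x], E[x^2]) = (mu, mu^2 + s^2)."""
    t1, t2 = theta
    mu = -t1 / (2.0 * t2)
    var = -1.0 / (2.0 * t2)
    return np.array([mu, mu * mu + var])

def bregman(theta_p, theta_q):
    """B_F(theta_p : theta_q) = KL(p_{theta_q} : p_{theta_p})."""
    return (log_normalizer(theta_p) - log_normalizer(theta_q)
            - np.dot(np.asarray(theta_p) - np.asarray(theta_q),
                     grad_log_normalizer(theta_q)))

def chernoff_information(theta1, theta2, tol=1e-12):
    """Bisection on alpha: the Chernoff point theta_a = a*theta1 + (1-a)*theta2
    is characterized by B_F(theta1 : theta_a) = B_F(theta2 : theta_a)."""
    theta1, theta2 = np.asarray(theta1, float), np.asarray(theta2, float)
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        a = 0.5 * (lo + hi)
        theta_a = a * theta1 + (1.0 - a) * theta2
        gap = bregman(theta1, theta_a) - bregman(theta2, theta_a)
        if gap > 0.0:
            lo = a          # theta_a still too close to theta2: increase alpha
        else:
            hi = a
    a = 0.5 * (lo + hi)
    theta_a = a * theta1 + (1.0 - a) * theta2
    return bregman(theta1, theta_a)

def gaussian_natural(mu, sigma2):
    """Natural parameters of N(mu, sigma2)."""
    return np.array([mu / sigma2, -0.5 / sigma2])

print(chernoff_information(gaussian_natural(0.0, 1.0), gaussian_natural(2.0, 1.0)))
print((0.0 - 2.0) ** 2 / 8.0)
```

For equal variances the optimum falls at α = 1/2, so the routine should reproduce the known closed form (μ₁ − μ₂)²/(8σ²) printed on the last line.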

Similar resources

A Bayesian approach to sequential surveillance in exponential families

We describe herein a Bayesian change-point model and the associated recursive formulas for the estimated time-varying parameters and the posterior probability that a change-point has occurred at a particular time. The proposed model is a variant of that of Chernoff and Zacks (1964) for the case of normal means with known common variance. It considers more generally the multiparameter exponentia...

Distributed Detection over Random Networks: Large Deviations Performance Analysis

We study the large deviations performance, i.e., the exponential decay rate of the error probability, of distributed detection algorithms over random networks. At each time step k each sensor: 1) averages its decision variable with the neighbors' decision variables; and 2) accounts on-the-fly for its new observation. We show that distributed detection exhibits a “phase change” behavior. When the...
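A rough illustrative sketch of the two per-step operations just listed (a generic consensus-plus-innovations recursion on an assumed ring network with assumed constants, not the paper's exact algorithm or random-network model):

```python
# Rough illustration of distributed detection by running consensus: each
# sensor averages its decision variable with its neighbors', then folds in
# the log-likelihood ratio of its newest observation.
import numpy as np

rng = np.random.default_rng(0)
n_sensors, n_steps, shift = 10, 200, 0.3   # H0: N(0,1) vs H1: N(shift,1)

# Doubly stochastic averaging weights on a ring: each sensor mixes equally
# with itself and its two neighbors.
W = np.zeros((n_sensors, n_sensors))
for i in range(n_sensors):
    W[i, i] = 1.0 / 3.0
    W[i, (i - 1) % n_sensors] = 1.0 / 3.0
    W[i, (i + 1) % n_sensors] = 1.0 / 3.0

def run(h1):
    x = np.zeros(n_sensors)                 # running decision variables
    for _ in range(n_steps):
        y = rng.normal(shift if h1 else 0.0, 1.0, n_sensors)
        llr = shift * y - 0.5 * shift**2    # per-sample log-likelihood ratio
        x = W @ x + llr                     # 1) average, 2) add innovation
    return x / n_steps                      # time-normalized statistic

print("mean statistic under H0:", run(False).mean())
print("mean statistic under H1:", run(True).mean())
```

Under H1 the time-normalized statistic concentrates around +shift²/2 and under H0 around −shift²/2, which is what a per-node threshold test would exploit.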

CS 174 Lecture 10 John Canny

But we already saw that some random variables (e.g. the number of balls in a bin) fall off exponentially with distance from the mean. So Markov and Chebyshev are very poor bounds for those kinds of random variables. The Chernoff bound applies to a class of random variables and does give exponential fall-off of probability with distance from the mean. The critical condition that’s needed for a C...
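As a quick numeric check of this point (my own binomial example, not the lecture's balls-in-bins calculation), the sketch below compares the Markov, Chebyshev and generic Chernoff bounds on the same tail: the first two barely move, while min_{t>0} e^(−ta) E[e^(tX)] decays at the right exponential rate.

```python
# Compare Markov, Chebyshev and the generic Chernoff bound on a binomial tail.
import math

n, p, a = 100, 0.1, 30          # X ~ Binomial(n, p), bound P[X >= a]
mu, var = n * p, n * p * (1 - p)

exact = sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(a, n + 1))
markov = mu / a
chebyshev = var / (a - mu) ** 2          # via P[|X - mu| >= a - mu]
# Chernoff: minimize e^{-ta} E[e^{tX}] = e^{-ta} (1 - p + p e^t)^n over t > 0,
# here by a brute-force grid search.
chernoff = min(math.exp(-t * a) * (1 - p + p * math.exp(t)) ** n
               for t in [i / 1000 for i in range(1, 5000)])

print(f"exact={exact:.3e} markov={markov:.3e} "
      f"chebyshev={chebyshev:.3e} chernoff={chernoff:.3e}")
```

Here Markov and Chebyshev overshoot the true tail by several orders of magnitude, while the Chernoff bound stays within a modest factor of it.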

Applied Stochastic Processes Problem Set 3

Compute the Chernoff bound on P[X ≥ a] where X is a random variable that satisfies the exponential law f_X(x) = λ e^(−λx) u(x). Recall, from Equation 4.6-4 on page 214 in [3], that the Chernoff bound for a continuous random variable X is given by P[X ≥ a] ≤ min_{t>0} e^(−at) θ_X(t), (1) where θ_X(t) is the moment-generating function θ_X(t) = E[e^(tX)] = ∫_{−∞}^{∞} e^(tx) f_X(x) dx, (2) as defined by Equ...
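Assuming the standard moment-generating function of the exponential law, θ_X(t) = λ/(λ − t) for t < λ, the minimizer of e^(−at) θ_X(t) is t* = λ − 1/a whenever a > 1/λ, giving the closed-form bound aλ e^(1 − aλ), while the exact tail is e^(−aλ). A short numerical check of that derivation (mine, not part of the problem set):

```python
# Chernoff bound for X ~ Exp(lambda): compare the closed-form minimizer,
# a brute-force minimum over t, and the exact tail probability.
import math

lam, a = 1.0, 5.0
t_star = lam - 1.0 / a                               # interior minimizer t* = lambda - 1/a
numeric = min(math.exp(-t * a) * lam / (lam - t)     # brute-force minimum over a grid
              for t in [lam * i / 10000 for i in range(1, 10000)])
closed_form = a * lam * math.exp(1.0 - a * lam)      # a*lambda*e^(1 - a*lambda)
exact = math.exp(-a * lam)                           # true tail P[X >= a]

print(t_star, numeric, closed_form, exact)
```

For λ = 1 and a = 5 the bound is about 0.092 against an exact tail of about 0.0067, so the Chernoff bound captures the right exponential order but not the constant.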

On large deviations in testing simple hypotheses for locally stationary Gaussian processes

We derive a large deviation result for the log-likelihood ratio for testing simple hypotheses in locally stationary Gaussian processes. This result allows us to find explicitly the rates of exponential decay of the error probabilities of type I and type II for Neyman–Pearson tests. Furthermore, we obtain the analogue of classical results on asymptotic efficiency of tests such as Stein's lemma an...
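For a feel for the kind of error exponents involved, here is a small closed-form illustration with an i.i.d. Gaussian pair of my own choosing (far simpler than the locally stationary setting of the paper): with the type-I error of the Neyman–Pearson test held fixed, the optimal type-II error decays like e^(−n·KL), as in Stein's lemma.

```python
# Stein-type exponent for testing N(0,1) against N(m,1) with n i.i.d. samples:
# with the type-I error fixed at alpha, the optimal type-II error is
# beta_n = Phi(z_{1-alpha} - m*sqrt(n)), and -log(beta_n)/n tends to KL = m^2/2.
from statistics import NormalDist
import math

m, alpha = 1.0, 0.05
kl = m * m / 2.0                               # D(N(0,1) || N(m,1)) = m^2/2
z = NormalDist().inv_cdf(1.0 - alpha)          # type-I quantile z_{1-alpha}

for n in (10, 100, 1000):
    arg = z - m * math.sqrt(n)
    beta = 0.5 * math.erfc(-arg / math.sqrt(2.0))   # Phi(arg) via erfc (deep tail)
    print(n, beta, -math.log(beta) / n, kl)
```

The printed ratio −log(β)/n climbs toward KL = 1/2 as n grows.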

Journal:
  • CoRR

Volume: abs/1102.2684
Pages: -
Publication date: 2011