Regret and Jeffreys Integrals in Exp. Families

Authors

  • Peter Grünwald
  • Peter Harremoës
Abstract

where Z is the partition function $Z(\beta) = \int \exp(\beta x)\,\mathrm{d}Q(x)$, and $\Gamma_{\mathrm{can}} := \{\beta \mid Z(\beta) < \infty\}$ is the canonical parameter space. We let $\beta_{\sup} = \sup\{\beta \mid \beta \in \Gamma_{\mathrm{can}}\}$, and $\beta_{\inf}$ likewise. The elements of the exponential family are also parametrized by their mean value $\mu$. We write $\mu_\beta$ for the mean value corresponding to the canonical parameter $\beta$ and $\beta_\mu$ for the canonical parameter corresponding to the mean value $\mu$. For any $x$ the maximum likelihood distribution is $P_{\beta_x}$. The Shtarkov integral $S$ is defined as …
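The abstract breaks off before the Shtarkov integral is spelled out, so the following is only a minimal numerical sketch. It uses the Bernoulli family with Q the counting measure on {0, 1} (an illustrative choice, not taken from the paper), and it assumes the standard discrete reading S_n = sum over sequences of max_beta P_beta(x^n); the function names Z, mean, beta_of_mean and log_shtarkov_sum are likewise made up for this sketch. It shows the partition function, the mean-value reparametrization, and how log S_n tracks the Jeffreys-integral asymptotics (1/2) log(n / 2π) + log ∫ sqrt(det I(μ)) dμ.

    # Minimal numeric sketch (illustrative assumptions, see above): Bernoulli family,
    # Q = counting measure on {0, 1}, discrete reading of the Shtarkov integral.
    import math

    def Z(beta):
        # Partition function Z(beta) = integral of exp(beta * x) dQ(x) = 1 + e^beta.
        return 1.0 + math.exp(beta)

    def mean(beta):
        # Mean-value parameter mu_beta = d/d(beta) log Z(beta).
        return math.exp(beta) / (1.0 + math.exp(beta))

    def beta_of_mean(mu):
        # Canonical parameter beta_mu corresponding to mean value mu (the logit map).
        return math.log(mu / (1.0 - mu))

    def log_shtarkov_sum(n):
        # log S_n = log sum_{k=0}^{n} C(n, k) (k/n)^k (1 - k/n)^(n - k):
        # each sequence with k ones is scored by its maximum-likelihood element P_{beta_x}.
        log_terms = []
        for k in range(n + 1):
            mu_hat = k / n
            loglik = (k * math.log(mu_hat) if k > 0 else 0.0) \
                   + ((n - k) * math.log(1.0 - mu_hat) if k < n else 0.0)
            log_comb = math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
            log_terms.append(log_comb + loglik)
        m = max(log_terms)
        return m + math.log(sum(math.exp(t - m) for t in log_terms))

    if __name__ == "__main__":
        n = 1000
        # Jeffreys-integral asymptotics: (1/2) log(n / (2 pi)) + log(pi), since the
        # Jeffreys integral for Bernoulli is int_0^1 sqrt(1 / (mu (1 - mu))) d mu = pi.
        jeffreys = 0.5 * math.log(n / (2.0 * math.pi)) + math.log(math.pi)
        print(f"log S_n = {log_shtarkov_sum(n):.4f}, Jeffreys asymptotics = {jeffreys:.4f}")

For moderate n the two numbers already agree closely, which is the kind of regret/Jeffreys-integral correspondence the paper's title refers to.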


Related articles

Asymptotically minimax regret for exponential families

We study the problem of data compression, gambling and prediction of a sequence x = x1x2...xn from a certain alphabet X, in terms of regret and redundancy with respect to a general exponential family. In particular, we evaluate the regret of the Bayes mixture density and show that it asymptotically achieves the minimax value when variants of Jeffreys prior are used. Keywords— universal codi...


Statistical Curvature and Stochastic Complexity

We discuss the relationship between the statistical embedding curvature [1, 2] and the logarithmic regret [11] (regret for short) of the Bayesian prediction strategy (or coding strategy) for curved exponential families and Markov models. The regret of a strategy is defined as the difference of the logarithmic loss (code length) incurred by the strategy and that of the best strategy for each dat...


Asymptotically minimax regret by Bayes mixtures (Proceedings of the 1998 IEEE International Symposium on Information Theory)

We study the problem of data compression, gambling and prediction of a sequence z^n = z_1 z_2 ... z_n from a certain alphabet X, in terms of regret [4] and redundancy with respect to a general exponential family, a general smooth family, and also Markov sources. In particular, we show that variants of Jeffreys mixture asymptotically achieve their minimax values. These results are generalizations of...


Robustly Minimax Codes for Universal Data Compression

We introduce a notion of ‘relative redundancy’ for universal data compression and propose a universal code which asymptotically achieves the minimax value of the relative redundancy. The relative redundancy is a hybrid of redundancy and coding regret (pointwise redundancy), where a class of information sources and a class of codes are assumed. The minimax code for relative redundancy is an exte...


Exchangeability Characterizes Optimality of Sequential Normalized Maximum Likelihood and Bayesian Prediction with Jeffreys Prior

We study online prediction of individual sequences under logarithmic loss with parametric constant experts. The optimal strategy, normalized maximum likelihood (NML), is computationally demanding and requires the length of the game to be known. We consider two simpler strategies: sequential normalized maximum likelihood (SNML), which computes the NML forecasts at each round as if it were the la...



Journal:
  • CoRR

Volume abs/0903.5399

Pages -

Publication year 2009