Search results for: jeffreys
Number of results: 514
The inequality containing the Csiszár divergence on time scales is generalized for 2n-convex functions by using the Lidstone interpolating polynomial. As an application, new entropic bounds are also computed. Several inequalities in quantum calculus and h-discrete calculus are established. The relationship of Shannon entropy and the Kullback–Leibler and Jeffreys distances with the Zipf–Mandelbrot entropy is also studied.
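The quantities named in this abstract are related in a simple way: the Jeffreys distance is the symmetrized Kullback–Leibler divergence. A minimal sketch of these definitions for discrete distributions (the function names and example distributions are illustrative, not from the paper):

```python
import math

def shannon_entropy(p):
    # H(p) = -sum_i p_i ln p_i, with the convention 0 ln 0 = 0
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    # D(p || q) = sum_i p_i ln(p_i / q_i); asymmetric in p and q
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jeffreys_distance(p, q):
    # Jeffreys distance: the symmetrized KL divergence D(p||q) + D(q||p)
    return kl_divergence(p, q) + kl_divergence(q, p)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(shannon_entropy(p))
print(jeffreys_distance(p, q))  # symmetric and non-negative
```

Unlike the KL divergence, the Jeffreys distance is symmetric in its arguments, which is why it serves as a distance-like measure between the competing distributions.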
When observing data x_1, …, x_t modelled by a probability distribution p_θ(x), the maximum likelihood (ML) estimator θ_ML = argmax_θ Σ_{i=1}^{t} ln p_θ(x_i) cannot, in general, safely be used to predict x_{t+1}. For instance, for a Bernoulli process, if only “tails” have been observed so far, the probability of “heads” is estimated to be 0. Laplace’s famous “add-one” rule of succession (e.g., [Grü07]) re...
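The Bernoulli example above can be made concrete. The ML estimate of P(heads) after k heads in n tosses is k/n, while Laplace's add-one rule gives (k+1)/(n+2); a small sketch (function names are illustrative):

```python
def ml_estimate(heads, tails):
    # Maximum-likelihood estimate of P(heads): k / n
    return heads / (heads + tails)

def laplace_rule(heads, tails):
    # Laplace's "add-one" rule of succession: (k + 1) / (n + 2)
    return (heads + 1) / (heads + tails + 2)

# After observing 5 tails and no heads:
print(ml_estimate(0, 5))   # 0.0 — assigns probability zero to a future head
print(laplace_rule(0, 5))  # 1/7 ≈ 0.1429 — never degenerate
```

The add-one rule is the posterior predictive under a uniform prior on the Bernoulli parameter; a Jeffreys prior would instead give the "add-one-half" rule (k + 1/2)/(n + 1).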
Objective Bayesian inference for the multivariate normal distribution is illustrated, using different types of formal objective priors (Jeffreys, invariant, reference and matching), different modes of inference (Bayesian and frequentist), and different criteria involved in selecting optimal objective priors (ease of computation, frequentist performance, marginalization paradoxes, and decision-t...
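As a univariate illustration of the frequentist-matching property discussed in this abstract (this is a sketch of the standard normal-model result, not the paper's multivariate analysis): under the independence-Jeffreys / reference prior π(μ, σ²) ∝ 1/σ², the marginal posterior of μ is a Student-t, so the equal-tailed credible interval coincides exactly with the classical t interval. The function name below is illustrative:

```python
import math
from scipy import stats

def jeffreys_normal_interval(x, level=0.95):
    # Under pi(mu, sigma^2) ∝ 1/sigma^2 for a univariate normal sample,
    # (mu - xbar) / (s / sqrt(n)) | data  ~  t_{n-1},
    # so the credible interval equals the frequentist t interval (exact matching).
    n = len(x)
    xbar = sum(x) / n
    s = math.sqrt(sum((xi - xbar) ** 2 for xi in x) / (n - 1))
    tq = stats.t.ppf(0.5 + level / 2, df=n - 1)
    half = tq * s / math.sqrt(n)
    return xbar - half, xbar + half

lo, hi = jeffreys_normal_interval([4.1, 5.0, 3.8, 4.6, 4.9])
print(lo, hi)  # symmetric about the sample mean
```

This exact matching is one of the frequentist-performance criteria the abstract lists for comparing objective priors.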
Simultaneous predictive distributions for independent Poisson observables are investigated. A class of improper prior distributions for Poisson means is introduced. The Bayesian predictive distributions based on priors from the introduced class are shown to be admissible under the Kullback–Leibler loss. A Bayesian predictive distribution based on a prior in this class dominates the Bayesian pre...
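For a single Poisson observable, the Bayesian predictive distribution under a conjugate Gamma(a, b) prior on the mean has a closed form: a negative binomial. A minimal sketch (the prior parameters and data are illustrative; a = 1/2, b = 0 corresponds to the improper Jeffreys-type prior λ^{-1/2}, which yields a proper posterior once data are observed):

```python
import math

def poisson_predictive_pmf(y, a, b, data):
    # Bayesian predictive p(y | data) for a Poisson model with a Gamma(a, b)
    # prior on the mean: posterior is Gamma(a + s, b + n) with s = sum(data),
    # and the predictive is negative binomial with
    #   r = a + s,  success probability p = (b + n) / (b + n + 1).
    n, s = len(data), sum(data)
    r, p = a + s, (b + n) / (b + n + 1.0)
    # NB pmf: Gamma(y + r) / (y! Gamma(r)) * p^r * (1 - p)^y, in log space
    logpmf = (math.lgamma(y + r) - math.lgamma(y + 1) - math.lgamma(r)
              + r * math.log(p) + y * math.log(1 - p))
    return math.exp(logpmf)

# Predictive probability of observing y = 2 next, after counts 2, 3, 1:
print(poisson_predictive_pmf(2, 0.5, 0.0, [2, 3, 1]))
```

The abstract's admissibility and dominance results concern the simultaneous (multivariate) version of this construction under Kullback–Leibler loss; the sketch only shows the building block for one coordinate.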
Objective priors for sequential experiments are considered. Common priors, such as the Jeffreys prior and the reference prior, will typically depend on the stopping rule used for the sequential experiment. New expressions for reference priors are obtained in various contexts, and computational issues involving such priors are considered.
In this paper we introduce objective proper prior distributions for hypothesis testing and model selection based on measures of divergence between the competing models; we call them divergence based (DB) priors. DB priors have simple forms and desirable properties, like information (finite sample) consistency; often, they are similar to other existing proposals like the intrinsic priors; moreov...