Search results for: kullback

Number of results: 7189

Esmaiel Abounoori, Mohsen Ali Heydari

In this paper we examine the application of the Kullback-Leibler, Hellinger and LINEX loss functions in a Dynamic Linear Model (DLM), using 106 years of real oil price data from 1913 to 2018 and focusing on the asymmetry problem in filtering and forecasting. We use the DLM form of the basic Hotelling model under the Quadratic, Kullback-Leibler, Hellinger and LINEX loss functions, trying to address the ...

Journal: SSRN Electronic Journal 2011
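
For context, a minimal sketch of the asymmetric LINEX loss that the abstract contrasts with quadratic loss; the Varian-style parameterization b(e^(a·error) − a·error − 1) and the parameter values below are illustrative assumptions, not taken from the paper:

import numpy as np

# LINEX loss (assumed parameterization, for illustration only): roughly
# exponential on one side of zero and roughly linear on the other, unlike
# the symmetric quadratic loss error**2.
def linex_loss(error, a=1.0, b=1.0):
    return b * (np.exp(a * error) - a * error - 1.0)

errors = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
print(linex_loss(errors))  # asymmetric: loss at +2 far exceeds loss at -2
print(errors ** 2)         # quadratic loss treats both sides the same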

Journal: Axioms 2017
Dagmar Markechová

The main aim of this contribution is to define the notions of Kullback-Leibler divergence and conditional mutual information in fuzzy probability spaces and to derive the basic properties of the suggested measures. In particular, chain rules for mutual information of fuzzy partitions and for Kullback-Leibler divergence with respect to fuzzy P-measures are established. In addition, a convexity o...
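
As a crisp-space reference point for the chain rules mentioned above, the classical Kullback-Leibler chain rule can be checked numerically; the joint distributions below are made up for illustration:

import numpy as np

# Classical chain rule for Kullback-Leibler divergence (the crisp analogue
# of the fuzzy-partition chain rules discussed in the abstract):
#   D(p(x,y) || q(x,y)) = D(p(x) || q(x)) + E_{p(x)}[ D(p(y|x) || q(y|x)) ]

def kl(p, q):
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * np.log(p / q)))

# Two joint distributions on a 2x3 product space (rows: x, columns: y).
P = np.array([[0.10, 0.20, 0.10],
              [0.25, 0.15, 0.20]])
Q = np.array([[0.15, 0.15, 0.10],
              [0.20, 0.20, 0.20]])

joint = kl(P.ravel(), Q.ravel())
marginal = kl(P.sum(axis=1), Q.sum(axis=1))
conditional = sum(
    P[x].sum() * kl(P[x] / P[x].sum(), Q[x] / Q[x].sum())
    for x in range(P.shape[0])
)
assert np.isclose(joint, marginal + conditional)  # the chain rule holds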

2008
Yuefeng Wu, Subhashis Ghosal

Abstract: Positivity of the prior probability of a Kullback-Leibler neighborhood around the true density, commonly known as the Kullback-Leibler property, plays a fundamental role in posterior consistency. A popular prior for Bayesian estimation is given by a Dirichlet mixture, where the kernels are chosen depending on the sample space and the class of densities to be estimated. The Kullback-Leib...
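
For reference, the Kullback-Leibler property described above is usually stated as follows (generic notation, not necessarily the paper's): a prior \Pi has the property at the true density f_0 if

\Pi\Big(\Big\{ f : \int f_0 \log\frac{f_0}{f}\, d\mu < \varepsilon \Big\}\Big) > 0 \quad \text{for every } \varepsilon > 0,

where the set in braces is the Kullback-Leibler neighborhood of radius \varepsilon around f_0 and \Pi is, for example, a Dirichlet mixture prior.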

The purpose of this paper is to obtain the tracking interval for the difference of expected Kullback-Leibler risks of two models under a Type II hybrid censoring scheme. This interval helps us evaluate the proposed models against each other. We derive a statistic which tracks the difference of expected Kullback-Leibler risks between maximum likelihood estimators of the distribution in two diff...

Journal: ADCAIJ: Advances in Distributed Computing and Artificial Intelligence Journal 2014
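
A generic normal-approximation form of such a tracking interval, given here only as a sketch of the usual construction; the paper's exact statistic under Type II hybrid censoring may differ:

\hat{\Delta}_n \pm z_{1-\alpha/2}\, \frac{\hat{\omega}_n}{\sqrt{n}},

where \hat{\Delta}_n estimates the difference of expected Kullback-Leibler risks of the two candidate models and \hat{\omega}_n^2 estimates the asymptotic variance of the per-observation log-likelihood-ratio contributions.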

Journal: IEEE Transactions on Information Theory 2014
Tim van Erven, Peter Harremoës

Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon’s entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and depends on a parameter that is called its order. In particular, the Rényi divergence of order 1 equals the Kullback-Leibl...
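
The order-1 limit mentioned in the abstract can be verified numerically; a minimal sketch with made-up discrete distributions:

import numpy as np

def renyi_divergence(p, q, alpha):
    # Rényi divergence of order alpha (alpha > 0, alpha != 1):
    #   D_alpha(p || q) = log( sum_i p_i^alpha * q_i^(1-alpha) ) / (alpha - 1)
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

def kl_divergence(p, q):
    # Kullback-Leibler divergence, the alpha -> 1 limit of Rényi divergence.
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * np.log(p / q)))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
for alpha in (0.5, 0.9, 0.99, 0.999):
    print(alpha, renyi_divergence(p, q, alpha))  # approaches the KL value
print("KL:", kl_divergence(p, q))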

Chart: number of search results per year