Search results for: kullback leibler

Number of results: 7228

Anis Iranmanesh, M. Arashi, S. M. M. Tabatabaey

In this paper, by conditioning on the matrix variate normal distribution (MVND), the construction of the matrix t-type family is considered, thus providing a new perspective on this family. Some important statistical characteristics are given. The presented t-type family is an extension of the work of Dickey [8]. A Bayes estimator for the column covariance matrix Σ of the MVND is derived under ...

2006
Piotr Garbaczewski

We analyze the contrasting dynamical behavior of Gibbs–Shannon and conditional Kullback–Leibler entropies, induced by the time evolution of continuous probability distributions. The question of a predominantly purpose-dependent entropy definition for non-equilibrium model systems is addressed. The conditional Kullback–Leibler entropy is often believed to properly capture physical features of an asymptoti...

Journal: :Entropy 2005
Piotr Garbaczewski

We give a detailed analysis of the Gibbs-type entropy notion and its dynamical behavior in case of time-dependent continuous probability distributions of varied origins: related to classical and quantum systems. The purpose-dependent usage of conditional Kullback-Leibler and Gibbs (Shannon) entropies is explained in case of non-equilibrium Smoluchowski processes. A very different temporal behav...

2002
Jason K. Johnson

This paper considers an information theoretic min-max approach to the model selection problem. The aim of this approach is to select the member of a given parameterized family of probability models so as to minimize the worst-case Kullback-Leibler divergence from an uncertain “truth” model. Uncertainty of the truth is specified by an upper bound on the KL divergence relative to a given reference...

2008
Miroslav Kárný Josef Andrýsek

The non-symmetric Kullback–Leibler divergence (KLD) measures the proximity of probability density functions (pdfs). Bernardo (Ann. Stat. 1979; 7(3):686–690) showed its unique role in the approximation of pdfs. The order of the KLD arguments is also implied by his methodological result. Functional approximation of estimation and stabilized forgetting, serving for tracking of slowly varying parameters, us...
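The asymmetry noted in this abstract is easy to verify numerically. A minimal sketch for discrete distributions (the distributions `p` and `q` below are made-up values for illustration only):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions.

    Terms with p_i = 0 contribute nothing, by the usual 0*log(0) = 0 convention.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Two arbitrary example distributions over three outcomes.
p = [0.5, 0.4, 0.1]
q = [0.3, 0.3, 0.4]

d_pq = kl_divergence(p, q)  # D(p || q)
d_qp = kl_divergence(q, p)  # D(q || p)
# Both values are non-negative, but in general d_pq != d_qp,
# which is why the order of the KLD arguments matters.
```

Swapping the arguments gives a genuinely different number, so the choice of ordering is a modeling decision, not a convention.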

Journal: :CoRR 2015
Lan Yang Jingbin Wang Yujin Tu Prarthana Mahapatra Nelson Cardoso

This paper proposes a new method for vector quantization by minimizing the Kullback-Leibler divergence between the class label distributions over the quantization inputs, which are the original vectors, and the outputs, which are the quantization subsets of the vector set. In this way, the vector quantization output can keep as much information about the class labels as possible. An objective function is...

Journal: :IEEE Trans. Automat. Contr. 2008
Augusto Ferrante Michele Pavon Federico Ramponi

In this paper, we study a matricial version of a generalized moment problem with degree constraint. We introduce a new metric on multivariable spectral densities induced by the family of their spectral factors, which, in the scalar case, reduces to the Hellinger distance. We solve the corresponding constrained optimization problem via duality theory. A highly nontrivial existence theorem for th...

Journal: :IEEE Trans. Information Theory 2003
Tryphon T. Georgiou Anders Lindquist

We introduce a Kullback-Leibler type distance between spectral density functions of stationary stochastic processes and solve the problem of optimal approximation of a given spectral density Ψ by one that is consistent with prescribed second-order statistics. In general, such statistics are expressed as the state covariance of a linear filter driven by a stochastic process whose spectral densit...

Journal: :Pattern Recognition 2017
Moacir Ponti Josef Kittler Mateus Riva Teófilo Emídio de Campos Cemre Zor

In decision making systems involving multiple classifiers there is a need to assess classifier (in)congruence, that is, to gauge the degree of agreement between their outputs. A commonly used measure for this purpose is the Kullback-Leibler (KL) divergence. We propose a variant of the KL divergence, named decision cognizant Kullback-Leibler divergence (DC-KL), to reduce the contribution of the...
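A minimal sketch of the baseline the abstract mentions: using plain KL divergence to gauge incongruence between two classifiers' soft outputs. The classifier posteriors and the per-sample averaging below are illustrative assumptions for the example; this is not the DC-KL variant the paper proposes:

```python
import math

def kl(p, q, eps=1e-12):
    # Smoothed KL divergence; eps guards against log(0) for
    # classes assigned zero probability by either classifier.
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

# Hypothetical posterior outputs of two classifiers on three samples
# (three classes each); values are invented for illustration.
clf_a = [[0.7, 0.2, 0.1], [0.1, 0.8, 0.1], [0.3, 0.3, 0.4]]
clf_b = [[0.6, 0.3, 0.1], [0.2, 0.7, 0.1], [0.3, 0.4, 0.3]]

# Average per-sample divergence as a crude incongruence score:
# zero when the classifiers agree exactly, larger as they diverge.
incongruence = sum(kl(a, b) for a, b in zip(clf_a, clf_b)) / len(clf_a)
```

Identical outputs give a score of zero, so the score isolates disagreement between the two classifiers rather than their individual confidence.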
