Search results for: kullback leibler
Number of results: 7228
In this paper, the construction of the matrix t-type family is considered by conditioning on the matrix variate normal distribution (MVND), thus providing a new perspective on this family. Some important statistical characteristics are given. The presented t-type family is an extension of the work of Dickey [8]. A Bayes estimator for the column covariance matrix Σ of the MVND is derived under ...
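For orientation, the conditioning construction referred to above typically takes the following scale-mixture form (standard notation assumed here, not necessarily the paper's exact parameterization): if

\[ X \mid \Sigma \sim \mathcal{N}_{n \times p}(M,\; \Phi \otimes \Sigma), \qquad \Sigma \sim \mathcal{IW}_p(\nu, \Psi), \]

then, marginally, X follows a matrix variate t-type distribution, and under quadratic loss a Bayes estimator of the column covariance Σ is the posterior mean \( \mathbb{E}[\Sigma \mid X] \).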
We analyze the contrasting dynamical behavior of the Gibbs–Shannon and conditional Kullback–Leibler entropies induced by the time evolution of continuous probability distributions. The question of a predominantly purpose-dependent entropy definition for non-equilibrium model systems is addressed. The conditional Kullback–Leibler entropy is often believed to properly capture physical features of an asymptoti...
We give a detailed analysis of the Gibbs-type entropy notion and its dynamical behavior in the case of time-dependent continuous probability distributions of varied origins, related to classical and quantum systems. The purpose-dependent usage of the conditional Kullback–Leibler and Gibbs (Shannon) entropies is explained in the case of non-equilibrium Smoluchowski processes. A very different temporal behav...
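For reference, the two entropy functionals contrasted in the two abstracts above are standardly defined as follows (ρ_* denotes the stationary density; the authors' conventions are assumed to match):

\[ S(t) = -\int \rho(x,t)\,\ln \rho(x,t)\,dx, \qquad \mathcal{H}_c(t) = -\int \rho(x,t)\,\ln\frac{\rho(x,t)}{\rho_*(x)}\,dx = -D_{\mathrm{KL}}\big(\rho_t \,\|\, \rho_*\big). \]

For a Smoluchowski diffusion relaxing to ρ_*, \( \mathcal{H}_c(t) \) increases monotonically to 0 (an H-theorem), whereas S(t) need not be monotone, which is consistent with the contrasting temporal behavior described above.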
This paper considers an information-theoretic min-max approach to the model selection problem. The aim of this approach is to select the member of a given parameterized family of probability models so as to minimize the worst-case Kullback–Leibler divergence from an uncertain “truth” model. Uncertainty of the truth is specified by an upper bound on the KL divergence relative to a given reference...
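A plausible formalization of this min-max criterion (the notation here is assumed, not taken from the paper) is

\[ \min_{\theta \in \Theta}\; \max_{q \,:\, D(q \,\|\, r) \le c}\; D\big(q \,\|\, p_\theta\big), \]

where p_θ ranges over the parameterized family, r is the given reference model, c bounds the uncertainty of the truth q, and D(·‖·) is the Kullback–Leibler divergence.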
The non-symmetric Kullback–Leibler divergence (KLD) measures the proximity of probability density functions (pdfs). Bernardo (Ann. Stat. 1979; 7(3):686–690) showed its unique role in the approximation of pdfs. The order of the KLD arguments is also implied by his methodological result. Functional approximation of estimation and stabilized forgetting, serving for tracking of slowly varying parameters, us...
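Concretely, for densities f and g the divergence in question is

\[ D(f \,\|\, g) = \int f(x)\,\ln\frac{f(x)}{g(x)}\,dx \;\ge\; 0, \]

with equality iff f = g almost everywhere; since in general D(f‖g) ≠ D(g‖f), the order of the arguments carries methodological weight, which is the point attributed to Bernardo above.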
This paper proposes a new method for vector quantization that minimizes the Kullback–Leibler divergence between the class-label distributions over the quantization inputs, i.e., the original vectors, and over the output, i.e., the quantization subsets of the vector set. In this way, the vector quantization output keeps as much class-label information as possible. An objective function is...
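A minimal sketch of one plausible reading of this objective (names and details are hypothetical, not taken from the paper): score a hard quantizer by the class-label information its cells retain, measured with per-cell KL divergences from the global label distribution.

```python
import numpy as np

def kl(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q) for discrete distributions."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def label_information_score(labels, cells, n_classes, n_cells):
    """Cell-weighted average of D(p(c|k) || p(c)) -- hypothetical helper.

    This equals the mutual information between class label and cell
    index, so a quantizer that preserves class-label information
    makes the score large.
    """
    labels = np.asarray(labels)
    cells = np.asarray(cells)
    p_global = np.bincount(labels, minlength=n_classes) / len(labels)
    score = 0.0
    for k in range(n_cells):
        mask = cells == k                      # samples assigned to cell k
        if not mask.any():
            continue                           # skip empty cells
        p_k = np.bincount(labels[mask], minlength=n_classes) / mask.sum()
        score += (mask.sum() / len(labels)) * kl(p_k, p_global)
    return score
```

A candidate quantizer could then be selected by maximizing this score, i.e., by minimizing the class-label information lost through quantization.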
In this paper, we study a matricial version of a generalized moment problem with degree constraint. We introduce a new metric on multivariable spectral densities induced by the family of their spectral factors, which, in the scalar case, reduces to the Hellinger distance. We solve the corresponding constrained optimization problem via duality theory. A highly nontrivial existence theorem for th...
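In the scalar case mentioned above, the Hellinger distance between two spectral densities Φ₁ and Φ₂ on the unit circle reads (standard form, assumed to match the paper's normalization):

\[ d_H(\Phi_1, \Phi_2) = \left( \frac{1}{2\pi} \int_{-\pi}^{\pi} \Big| \sqrt{\Phi_1(e^{i\theta})} - \sqrt{\Phi_2(e^{i\theta})} \Big|^2 \, d\theta \right)^{1/2}. \]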
We introduce a Kullback-Leibler type distance between spectral density functions of stationary stochastic processes and solve the problem of optimal approximation of a given spectral density Ψ by one that is consistent with prescribed second-order statistics. In general, such statistics are expressed as the state covariance of a linear filter driven by a stochastic process whose spectral densit...
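A minimal statement of this approximation problem (notation assumed, not quoted from the paper): given Ψ, a filter transfer function G, and prescribed state covariance R, minimize over spectra Φ

\[ d(\Psi \,\|\, \Phi) = \int_{-\pi}^{\pi} \Psi(e^{i\theta}) \,\ln\frac{\Psi(e^{i\theta})}{\Phi(e^{i\theta})}\, \frac{d\theta}{2\pi} \quad \text{subject to} \quad \int_{-\pi}^{\pi} G(e^{i\theta})\, \Phi(e^{i\theta})\, G^{*}(e^{i\theta})\, \frac{d\theta}{2\pi} = R, \]

the constraint expressing consistency of Φ with the prescribed second-order statistics.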
In decision-making systems involving multiple classifiers, there is a need to assess classifier (in)congruence, that is, to gauge the degree of agreement between their outputs. A commonly used measure for this purpose is the Kullback–Leibler (KL) divergence. We propose a variant of the KL divergence, named the decision cognizant Kullback–Leibler divergence (DC-KL), to reduce the contribution of the...
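As a baseline for the incongruence measure discussed above, here is a sketch of the plain KL divergence (and one common symmetrization) between two classifiers' posteriors; the DC-KL variant itself modifies this quantity in ways not reproduced here.

```python
import numpy as np

def kl(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q) for discrete distributions."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

# Illustrative posteriors of two classifiers over the same three classes.
p1 = [0.70, 0.20, 0.10]
p2 = [0.10, 0.60, 0.30]

# Plain KL is asymmetric; averaging the two argument orders is one
# common symmetrization when a single incongruence score is wanted.
incongruence = 0.5 * (kl(p1, p2) + kl(p2, p1))
print(f"classifier incongruence = {incongruence:.3f}")
```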