Search results for: Kullback criterion

Number of results: 34836

Journal: IEEE Trans. Information Theory 2014
Tim van Erven Peter Harremoës

Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon’s entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and depends on a parameter that is called its order. In particular, the Rényi divergence of order 1 equals the Kullback-Leibl...
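As a rough illustration of the order parameter (my own sketch, not part of the paper), the snippet below computes the Rényi divergence of order α between two discrete distributions; as α approaches 1 the value approaches the Kullback-Leibler divergence. The distributions p and q are invented for the example.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Renyi divergence of order alpha between discrete distributions p and q.

    For alpha -> 1 this converges to the Kullback-Leibler divergence.
    Assumes p and q are strictly positive and sum to one.
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    if np.isclose(alpha, 1.0):
        return float(np.sum(p * np.log(p / q)))  # KL divergence (order 1)
    return float(np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0))

p = [0.2, 0.5, 0.3]
q = [0.3, 0.4, 0.3]
print(renyi_divergence(p, q, 0.5))
print(renyi_divergence(p, q, 0.999))  # close to the KL divergence
print(renyi_divergence(p, q, 1.0))    # KL divergence itself
```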

Journal: IEEE Transactions on Automatic Control 2015

Journal: Iranian Journal of Science and Technology (Sciences) 2006
A. R. Soleimani

Scott and Szewczyk, in Technometrics, 2001, introduced a similarity measure for two densities $f_1$ and $f_2$ by $\mathrm{sim}(f_1, f_2) = \langle f_1, f_2\rangle / \sqrt{\langle f_1, f_1\rangle\,\langle f_2, f_2\rangle}$, where $\langle f_1, f_2\rangle = \int_{-\infty}^{+\infty} f_1(x, \theta_1)\, f_2(x, \theta_2)\, dx$. $\mathrm{sim}(f_1, f_2)$ has some appropriate properties that make it a suitable measure of the similarity of $f_1$ and $f_2$. However, due to some restrictions on the values of the parameters and the kinds of densities, ...
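A minimal numerical sketch of this similarity measure, assuming the reconstruction above (the normalized L2 inner product of the two densities); the Gaussian densities and integration grid are illustrative choices, not from the paper.

```python
import numpy as np
from scipy.stats import norm

def inner(f1, f2, grid):
    """Riemann-sum approximation of <f1, f2> = integral of f1(x) * f2(x) dx."""
    dx = grid[1] - grid[0]
    return np.sum(f1(grid) * f2(grid)) * dx

def sim(f1, f2, grid):
    """Similarity as the normalized L2 inner product of the two densities."""
    return inner(f1, f2, grid) / np.sqrt(inner(f1, f1, grid) * inner(f2, f2, grid))

grid = np.linspace(-10.0, 10.0, 20001)
f1 = norm(loc=0.0, scale=1.0).pdf   # f1(x, theta1) with theta1 = (0, 1)
f2 = norm(loc=1.0, scale=1.5).pdf   # f2(x, theta2) with theta2 = (1, 1.5)
print(sim(f1, f1, grid))            # ~1.0: a density is maximally similar to itself
print(sim(f1, f2, grid))            # < 1.0 when the parameters differ
```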

2007
M. Tumminello F. Lillo R. N. Mantegna

The problem of filtering information from large correlation matrices is of great importance in many applications. We have recently proposed the use of the Kullback–Leibler distance to measure the performance of filtering algorithms in recovering the underlying correlation matrix when the variables are described by a multivariate Gaussian distribution. Here we use the Kullback–Leibler distance t...
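For concreteness, a hedged sketch of the kind of quantity involved (not the authors' code): the Kullback-Leibler distance between two zero-mean multivariate Gaussians specified by their correlation matrices, here comparing a "true" matrix with a hypothetical filtered one. Both matrices are invented for illustration.

```python
import numpy as np

def gaussian_kl(sigma1, sigma2):
    """KL divergence between zero-mean multivariate Gaussians N(0, sigma1) and N(0, sigma2)."""
    n = sigma1.shape[0]
    sigma2_inv = np.linalg.inv(sigma2)
    _, logdet1 = np.linalg.slogdet(sigma1)
    _, logdet2 = np.linalg.slogdet(sigma2)
    return 0.5 * (np.trace(sigma2_inv @ sigma1) - n + logdet2 - logdet1)

# Illustrative correlation matrices: a "true" matrix and a hypothetical filtered estimate.
true_corr = np.array([[1.0, 0.6, 0.3],
                      [0.6, 1.0, 0.5],
                      [0.3, 0.5, 1.0]])
filtered_corr = np.array([[1.0, 0.5, 0.4],
                          [0.5, 1.0, 0.4],
                          [0.4, 0.4, 1.0]])
# A smaller value means the filtered matrix recovers the true one more closely.
print(gaussian_kl(true_corr, filtered_corr))
```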

2006
Meral Candan Çetin Aydin Erar

In this paper, the problem of variable selection in linear regression is considered. This problem involves choosing the most appropriate model from the candidate models. Variable selection criteria based on estimates of the Kullback-Leibler information are most common. Akaike’s AIC and the bias-corrected AIC belong to this group of criteria. The reduction of the bias in estimating the Kullback-Leib...
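As a hedged illustration (not the authors' implementation), the sketch below computes AIC and the small-sample bias-corrected AICc for Gaussian linear regression models fit by least squares. The parameter count and dropped constants follow one common convention, and the data are synthetic.

```python
import numpy as np

def aic_aicc(y, X):
    """AIC and bias-corrected AICc for a Gaussian linear model fit by least squares.

    k counts the regression coefficients plus the error variance; other
    conventions shift both criteria by the same constant.
    """
    n = len(y)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    k = X.shape[1] + 1                        # coefficients + error variance
    aic = n * np.log(rss / n) + 2 * k
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)
    return aic, aicc

rng = np.random.default_rng(0)
n = 30
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 2.0 * x1 + rng.normal(size=n)       # x2 is an irrelevant candidate regressor
X_small = np.column_stack([np.ones(n), x1])
X_full = np.column_stack([np.ones(n), x1, x2])
print("intercept + x1:     ", aic_aicc(y, X_small))
print("intercept + x1 + x2:", aic_aicc(y, X_full))  # the model with the smaller values is selected
```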

Journal: IEEE Transactions on Information Theory 2000

Journal: SIAM Journal on Scientific Computing 2021

We propose to compute a sparse approximate inverse Cholesky factor $L$ of a dense covariance matrix $\Theta$ by minimizing the Kullback--Leibler divergence between the Gaussian distributions $\mathcal{N}(0, \Theta)$ and $\mathcal{N}(0, L^{-\top} L^{-1})$, subject to a sparsity constraint. Surprisingly, this problem has a closed-form solution that can be computed efficiently, recovering the popular Vecchia approximation in spat...
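A small sketch of the objective being minimized, under my reading of the abstract: the KL divergence between $\mathcal{N}(0, \Theta)$ and $\mathcal{N}(0, (L L^\top)^{-1})$ for a lower-triangular candidate $L$. The covariance matrix below and the crude banding used to sparsify $L$ are illustrative only; the paper's method uses a prescribed sparsity pattern and a closed-form constrained solution.

```python
import numpy as np

def kl_objective(L, theta):
    """D_KL( N(0, theta) || N(0, (L L^T)^{-1}) ) for lower-triangular L with positive diagonal."""
    n = theta.shape[0]
    _, logdet_theta = np.linalg.slogdet(theta)
    return 0.5 * (np.trace(L.T @ theta @ L) - n
                  - 2.0 * np.sum(np.log(np.diag(L))) - logdet_theta)

# Hypothetical dense covariance: squared-exponential kernel on a 1-D grid (plus jitter).
x = np.linspace(0.0, 1.0, 8)
d = np.abs(x[:, None] - x[None, :])
theta = np.exp(-(d / 0.25) ** 2) + 1e-6 * np.eye(8)

exact_L = np.linalg.cholesky(np.linalg.inv(theta))  # dense inverse Cholesky factor
banded_L = np.tril(np.triu(exact_L, -1))            # crude sparsification: keep one subdiagonal

print(kl_objective(exact_L, theta))    # ~0: the exact factor gives zero divergence
print(kl_objective(banded_L, theta))   # > 0: the sparsity restriction costs some divergence
```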

2006
Young Kyung Lee Byeong U. Park

Motivated by the bandwidth selection problem in local likelihood density estimation and by the problem of assessing a final model chosen by a certain model selection procedure, we consider estimation of the Kullback–Leibler divergence. It is known that the best bandwidth choice for the local likelihood density estimator depends on the distance between the true density and the ‘vehicle’ para...
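One crude way to make such a divergence concrete (an assumption-laden sketch, not the authors' estimator): a Monte Carlo estimate of D_KL(true || model) between a true density and a parametric "vehicle" density. The Student-t and normal densities below are chosen purely for illustration.

```python
import numpy as np
from scipy.stats import norm, t as student_t

def mc_kl(true_dist, model_dist, n_samples=200_000, seed=0):
    """Monte Carlo estimate of D_KL(true || model) = E_true[log true(X) - log model(X)]."""
    rng = np.random.default_rng(seed)
    x = true_dist.rvs(size=n_samples, random_state=rng)
    return np.mean(true_dist.logpdf(x) - model_dist.logpdf(x))

# Hypothetical example: the true density is Student-t with 5 df, the vehicle model a normal.
true_f = student_t(df=5)
vehicle_g = norm(loc=0.0, scale=np.sqrt(5.0 / 3.0))  # matches the variance of t_5
print(mc_kl(true_f, vehicle_g))  # distance between the true density and the parametric vehicle
```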
