Search results for: kullback leibler

Number of results: 7228

2003

Deconvolution is usually regarded as one of the ill-posed problems in applied mathematics if no constraints on the unknowns are assumed. In this paper, we discuss the idea of well-defined statistical models being a counterpart of the notion of well-posedness. We show that constraints on the unknowns such as positivity and sparsity can go a long way towards overcoming the ill-posedness in deconvo...
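The abstract does not name a specific algorithm, but a classic illustration of how a positivity constraint tames deconvolution is the Richardson-Lucy iteration, whose multiplicative updates keep the estimate nonnegative and decrease a generalized Kullback-Leibler objective. A minimal 1-D sketch (the spike signal and Gaussian kernel below are illustrative choices, not from the paper):

```python
import numpy as np

def richardson_lucy(observed, kernel, n_iter=50):
    """Positivity-preserving 1-D deconvolution (Richardson-Lucy).

    Starting from a flat positive guess, each multiplicative update
    keeps the estimate nonnegative and decreases the generalized
    Kullback-Leibler divergence between the observed signal and the
    reblurred estimate.
    """
    estimate = np.full_like(observed, observed.mean())
    kernel_flipped = kernel[::-1]
    for _ in range(n_iter):
        reblurred = np.convolve(estimate, kernel, mode="same")
        ratio = observed / np.maximum(reblurred, 1e-12)
        estimate *= np.convolve(ratio, kernel_flipped, mode="same")
    return estimate

# Recover a sparse, positive spike train blurred by a Gaussian kernel.
x = np.zeros(64)
x[[20, 40]] = [1.0, 0.5]
k = np.exp(-0.5 * np.arange(-3, 4) ** 2)
k /= k.sum()
y = np.convolve(x, k, mode="same")
x_hat = richardson_lucy(y, k, n_iter=200)
```

On this noiseless toy problem the iteration resharpens the blurred spikes while never producing negative values, which is the point the abstract makes about positivity constraints.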

Journal: :Speech Communication 1998
Kazuhiro Arai Jeremy H. Wright Giuseppe Riccardi Allen L. Gorin

A new method for automatically acquiring grammar fragments for understanding fluently spoken language is proposed. The goal of this method is to generate a collection of grammar fragments, each representing a set of syntactically and semantically similar phrases. First, phrases observed frequently in the training set are selected as candidates. Each candidate phrase has three associated probability...

2007
Patrick Marsh Peter Phillips Robert Taylor

This paper details the differential and numeric properties of two measures of entropy, Shannon entropy and Kullback-Leibler distance, applicable to the unit root hypothesis. It is found that they are differentiable functions of the degree of trending in any included deterministic component and of the correlation of the underlying innovations. Moreover, Shannon entropy is concave in these, and ...

Journal: :Physical review. E, Statistical, nonlinear, and soft matter physics 2010
Saar Rahav Shaul Mukamel

By subjecting a dynamical system to a series of short pulses and varying several time delays, we can obtain multidimensional characteristic measures of the system. Multidimensional Kullback-Leibler response functions (KLRF), which are based on the Kullback-Leibler distance between the initial and final states, are defined. We compare the KLRF, which are nonlinear in the probability density, with...

Journal: :European Journal of Operational Research 2010
M. J. Rufo Carlos J. Perez Jacinto Martín

In this paper, a general approach is proposed to address a full Bayesian analysis for the class of quadratic natural exponential families in the presence of several expert sources of prior information. By expressing the opinion of each expert as a conjugate prior distribution, a mixture model is used by the decision maker to arrive at a consensus of the sources. A hyperprior distribution on the...
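The mechanics of pooling expert opinions via a mixture of conjugate priors can be illustrated with the Bernoulli model and Beta priors (a standard conjugate pair; the two "expert" priors and the data below are invented for illustration): each component updates conjugately, and the mixture weights are rescaled by each component's marginal likelihood.

```python
import math

def betaln(a, b):
    # log of the Beta function via log-gamma.
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def posterior_mixture(weights, priors, successes, failures):
    """Exact posterior for Bernoulli data under a mixture of Beta priors.

    Each Beta(a, b) component becomes Beta(a + s, b + f); its new weight
    is proportional to the old weight times the component's marginal
    likelihood B(a + s, b + f) / B(a, b).
    """
    log_w, post = [], []
    for w, (a, b) in zip(weights, priors):
        a_n, b_n = a + successes, b + failures
        log_w.append(math.log(w) + betaln(a_n, b_n) - betaln(a, b))
        post.append((a_n, b_n))
    m = max(log_w)                       # stabilize before exponentiating
    w_new = [math.exp(v - m) for v in log_w]
    s = sum(w_new)
    return [v / s for v in w_new], post

# Two hypothetical experts: one optimistic, one pessimistic about the rate.
w, comps = posterior_mixture([0.5, 0.5], [(8.0, 2.0), (2.0, 8.0)],
                             successes=9, failures=1)
```

After seeing 9 successes in 10 trials, nearly all posterior weight shifts to the optimistic expert's component, which is how the decision maker's consensus adapts to the data.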

2014
Steeve Zozor Jean-Marc Brossier

In this paper we propose a generalization of the usual de Bruijn identity that links the Shannon differential entropy (or the Kullback–Leibler divergence) and the Fisher information (or the Fisher divergence) of the output of a Gaussian channel. The generalization makes use of φ-entropies on the one hand, and of φ-divergences (of the Csiszár class) on the other hand, as generalizations of the S...
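For reference, the classical de Bruijn identity that this work generalizes relates the differential entropy $h$ of the output of a Gaussian channel to its Fisher information $J$ (with $Z$ a standard Gaussian independent of $X$):

```latex
\frac{d}{dt}\, h\!\left(X + \sqrt{t}\,Z\right)
  \;=\; \frac{1}{2}\, J\!\left(X + \sqrt{t}\,Z\right).
```

The abstract's generalization replaces the Shannon entropy and Fisher information in this identity with φ-entropies and φ-divergences, respectively.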

Journal: :Entropy 2014
Keisuke Yano Fumiyasu Komaki

We investigate the asymptotic construction of constant-risk Bayesian predictive densities under the Kullback–Leibler risk when the distributions of data and target variables are different and have a common unknown parameter. It is known that the Kullback–Leibler risk is asymptotically equal to a trace of the product of two matrices: the inverse of the Fisher information matrix for the data and ...

Journal: :CoRR 2016
Sukanya Patil Ajit Rajwade

Reconstruction error bounds in compressed sensing under Gaussian or uniform bounded noise do not translate easily to the case of Poisson noise. Reasons for this include the signal dependent nature of Poisson noise, and also the fact that the negative log likelihood in case of a Poisson distribution (which is directly related to the generalized Kullback-Leibler divergence) is not a metric and do...
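The generalized Kullback-Leibler divergence the abstract refers to is, up to terms independent of the estimate, the negative Poisson log-likelihood, and it is easy to check numerically that it is not a metric (the example vectors below are arbitrary):

```python
import numpy as np

def generalized_kl(y, x):
    """Generalized KL divergence D(y || x) = sum_i [ y_i log(y_i / x_i) - y_i + x_i ].

    Terms with y_i = 0 contribute only their x_i part, using the
    convention 0 * log 0 = 0.
    """
    y = np.asarray(y, dtype=float)
    x = np.asarray(x, dtype=float)
    pos = y > 0
    return float(np.sum(y[pos] * np.log(y[pos] / x[pos])) - y.sum() + x.sum())

a = np.array([4.0, 1.0])
b = np.array([2.0, 3.0])
d_ab = generalized_kl(a, b)
d_ba = generalized_kl(b, a)
```

The divergence vanishes only when the arguments coincide, but `d_ab != d_ba`: the asymmetry already rules out metric structure, which is one reason Gaussian-noise error-bound arguments do not carry over to the Poisson case.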

2006
Rani Nelken Stuart M. Shieber

Kullback-Leibler divergence is a natural distance measure between two probabilistic finite-state automata. Computing this distance is difficult, since it requires a summation over a countably infinite number of strings. Nederhof and Satta (2004) recently provided a solution in the course of solving the more general problem of finding the cross-entropy between a probabilistic context-free gramma...
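The countably infinite summation the abstract mentions can be seen even in the simplest case: a single-state automaton that emits a symbol with probability 1 - s and stops with probability s assigns geometric probabilities to string lengths. For two such toy automata the infinite sum has a closed form, so a truncated sum can be checked against it (this is a simplified stand-in for the general probabilistic-automata setting, which requires the machinery the paper discusses):

```python
import math

def kl_geometric_truncated(s1, s2, max_len):
    """Truncated KL divergence between two single-state stop/continue
    automata, i.e. geometric length distributions p(n) = s1 * (1 - s1)**n,
    summing only over strings of length up to max_len."""
    total = 0.0
    for n in range(max_len + 1):
        p = s1 * (1 - s1) ** n
        r = s2 * (1 - s2) ** n
        total += p * math.log(p / r)
    return total

def kl_geometric_closed(s1, s2):
    # Closed form: log(s1/s2) + E[n] * log(q1/q2), with E[n] = q1/s1.
    q1, q2 = 1 - s1, 1 - s2
    return math.log(s1 / s2) + (q1 / s1) * math.log(q1 / q2)

approx = kl_geometric_truncated(0.5, 0.3, 200)
exact = kl_geometric_closed(0.5, 0.3)
```

The truncated sum converges geometrically here; for general automata no such closed form is available, which is why exact computation requires techniques like the cross-entropy solution of Nederhof and Satta cited in the abstract.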

Journal: :Signal Processing 2010
Abd-Krim Seghouane

The Akaike information criterion, AIC, and its corrected version, AICc, are two methods for selecting normal linear regression models. Both criteria were designed as estimators of the expected Kullback–Leibler information between the model generating the data and the approximating candidate model. In this paper, two new corrected variants of AIC are derived for the purpose of small-sample linear...
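The standard AIC and AICc formulas for a least-squares fit can be sketched as follows (one common convention, dropping additive constants; the synthetic data are illustrative, and the paper's two new corrected variants are not reproduced here):

```python
import numpy as np

def aic_aicc(y, X):
    """AIC and small-sample corrected AICc for a normal linear model
    fit by least squares; k counts the regression coefficients plus
    the error variance."""
    n, p = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    k = p + 1                                   # coefficients + variance
    aic = n * np.log(rss / n) + 2 * k
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)  # small-sample correction
    return aic, aicc

# Synthetic small-sample regression with an intercept and two predictors.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(20), rng.normal(size=(20, 2))])
y = X @ np.array([1.0, 2.0, 0.0]) + rng.normal(size=20)
aic, aicc = aic_aicc(y, X)
```

The correction term is strictly positive for n > k + 1 and vanishes as n grows, so AICc penalizes model complexity more heavily exactly in the small-sample regime the abstract targets.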

[Chart: number of search results per publication year]