Search results for: conditional entropy

Number of results: 122651

Journal: SIAM Journal on Scientific Computing, 2021

Article data: submitted 1 June 2020; accepted 21 May 2021; published online 26 October 2021. Keywords: Hamiltonian Monte Carlo, Kolmogorov–Sinai entropy, Markov chain Monte Carlo. AMS subject headings: 65C05. ISSN (print): 1064-8275; ISSN (online): 1095-7197. Publisher: Society for Industrial and App...

2016
Rafael Berkvens, Herbert Peremans, Maarten Weyn

Localization systems are increasingly valuable, but their location estimates are only useful when the uncertainty of the estimate is known. This uncertainty is currently calculated as the location error given a ground truth, which is then used as a static measure in sometimes very different environments. In contrast, we propose the use of the conditional entropy of a posterior probability distr...
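The abstract above proposes the conditional entropy of a posterior distribution as an uncertainty measure for location estimates. As an illustrative sketch (the function name and the toy posteriors are assumptions, not taken from the paper), the entropy of a discrete posterior over candidate locations can be computed as:

```python
import numpy as np

def posterior_entropy(posterior, base=2.0):
    """Shannon entropy (default: bits) of a discrete posterior over
    candidate locations; higher values mean a less certain estimate."""
    p = np.asarray(posterior, dtype=float)
    p = p / p.sum()          # normalize to a probability distribution
    nz = p[p > 0]            # 0 * log 0 = 0 by convention
    return float(-(nz * np.log(nz)).sum() / np.log(base))

# A sharply peaked posterior carries less uncertainty than a flat one:
peaked = posterior_entropy([0.9, 0.05, 0.03, 0.02])
flat = posterior_entropy([0.25, 0.25, 0.25, 0.25])  # uniform over 4 cells
```

Unlike a location error measured against a ground truth, this quantity can be computed online from the estimator's own posterior, which is the contrast the abstract draws.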

Journal: SIAM J. Control and Optimization, 2015
Igor G. Vladimirov, Ian R. Petersen

The paper is concerned with a dissipativity theory and robust performance analysis of discrete-time stochastic systems driven by a statistically uncertain random noise. The uncertainty is quantified by the conditional relative entropy of the actual probability law of the noise with respect to a nominal product measure corresponding to a white noise sequence. We discuss a balance equation, dissi...

2012
R. Sukesh Kumar, Jasprit Singh

In this work a novel approach to color image segmentation is discussed, using higher-order entropy as a textural feature to determine thresholds over a two-dimensional image histogram. A similar approach is applied to achieve multi-level thresholding in both grayscale and color images. The paper discusses two methods of color image segmentation using RGB space as the standard processing...
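The paper's 2-D histogram method is not reproduced here; as a minimal sketch of the entropic-thresholding idea it builds on, a 1-D Kapur-style criterion picks the threshold that maximizes the summed entropies of the two classes (function name and toy histogram are illustrative assumptions):

```python
import numpy as np

def kapur_threshold(hist):
    """Kapur-style entropic threshold on a 1-D gray-level histogram:
    choose t maximizing the sum of the entropies of the two classes
    split at t (a 1-D sketch of the higher-order / 2-D variants)."""
    p = np.asarray(hist, dtype=float)
    p = p / p.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, len(p)):
        lo, hi = p[:t], p[t:]
        wl, wh = lo.sum(), hi.sum()
        if wl == 0 or wh == 0:
            continue
        h = 0.0
        for cls, w in ((lo, wl), (hi, wh)):
            q = cls[cls > 0] / w      # within-class distribution
            h -= (q * np.log(q)).sum()
        if h > best_h:
            best_t, best_h = t, h
    return best_t

# A bimodal histogram is split between its two modes:
t = kapur_threshold([10, 10, 0, 0, 10, 10])
```

The 2-D variant replaces the gray-level histogram with a joint histogram over pixel value and local context, but the maximum-entropy selection principle is the same.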

Journal: CoRR, 2013
Ramon Ferrer-i-Cancho, Lukasz Debowski, Fermín Moscoso del Prado Martín

Constant entropy rate (conditional entropies must remain constant as the sequence length increases) and uniform information density (conditional probabilities must remain constant as the sequence length increases) are two information theoretic principles that are argued to underlie a wide range of linguistic phenomena. Here we revise the predictions of these principles in the light of Hilberg's...
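Constant entropy rate concerns how the conditional entropy H(X_n | X_1 … X_{n-1}) behaves as the context length grows. A minimal empirical estimator from (k+1)-gram counts of a single observed sequence, sketched with hypothetical names not taken from the paper:

```python
from collections import Counter
from math import log2

def conditional_entropy(seq, k):
    """Empirical H(X_n | preceding k symbols), in bits, estimated
    from (k+1)-gram counts of one observed sequence."""
    ctx = Counter(tuple(seq[i:i + k]) for i in range(len(seq) - k))
    gram = Counter(tuple(seq[i:i + k + 1]) for i in range(len(seq) - k))
    n = sum(gram.values())
    h = 0.0
    for g, c in gram.items():
        p_joint = c / n                 # p(context, next symbol)
        p_cond = c / ctx[g[:k]]         # p(next symbol | context)
        h -= p_joint * log2(p_cond)
    return h

# k = 0 gives the plain symbol entropy; for a deterministic periodic
# sequence, one symbol of context removes all uncertainty:
h0 = conditional_entropy("ab" * 50, 0)
h1 = conditional_entropy("ab" * 50, 1)
```

Plotting such estimates against k is one simple way to probe whether conditional entropies stay constant or decay as the principles above predict, though finite-sample bias grows quickly with k.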

Journal: IEEE Trans. Information Theory, 2014
Serge Fehr, Stefan Berens

1999
Glen D. Johnson, Wayne L. Myers, Ganapati P. Patil, Charles Taillie, Anthony R. Olsen

For landscapes that are cast as categorical raster maps, we present an entropy-based method for obtaining a multiresolution characterization of spatial pattern. The result is a conditional entropy profile which reflects the rate of information loss as map resolution is degraded by increasing the pixel size through a resampling filter. We choose a random filter because of desirable properties ...

2003
Partha Pratim Mondal, Kanhirodan Rajan

Maximum Likelihood (ML) estimation is extensively used for estimating emission densities from clumped and incomplete measurement data in the Positron Emission Tomography (PET) modality. The reconstruction produced by the ML algorithm has been found to be noisy because it does not make use of available prior knowledge. Bayesian estimation provides such a platform for the inclusion of prior knowledge in the recons...

2003
Dan A. Simovici, Szymon Jaroszewicz

We introduce an extension of the notion of Shannon conditional entropy to a more general form of conditional entropy that captures both the conditional Shannon entropy and a similar notion related to the Gini index. The proposed family of conditional entropies generates a collection of metrics over the set of partitions of finite sets, which can be used to construct decision trees. Experimental...
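The paper's exact family is not reproduced here; one standard one-parameter family with the stated property is the Havrda–Charvát/Tsallis entropy, which recovers Shannon entropy as β → 1 and the Gini index at β = 2. A sketch, with the weighted conditional form commonly used for decision-tree splitting (function names are illustrative):

```python
import numpy as np

def tsallis_entropy(p, beta):
    """Havrda-Charvat/Tsallis entropy of a distribution p:
    beta -> 1 recovers Shannon entropy (in nats), and
    beta = 2 gives the Gini index 1 - sum(p_i^2)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if abs(beta - 1.0) < 1e-12:
        return float(-(p * np.log(p)).sum())
    return float((1.0 - (p ** beta).sum()) / (beta - 1.0))

def conditional_tsallis(joint, beta):
    """Weighted conditional entropy sum_x p(x) * H_beta(Y | X = x),
    computed from a joint p(x, y) table with rows indexed by x."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1)
    return float(sum(px[i] * tsallis_entropy(joint[i] / px[i], beta)
                     for i in range(len(px)) if px[i] > 0))

# If Y is a deterministic function of X, the conditional entropy
# vanishes for every beta:
joint = [[0.5, 0.0], [0.0, 0.5]]
```

Plugging different β into the same splitting criterion is what lets one family of conditional entropies cover both Shannon-entropy and Gini-index decision trees.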

2000
Xiaolin Wu, Philip A. Chou, Xiaohui Xue

We consider the problem of finding the quantizer Q that quantizes the K-dimensional causal context C_i = (X_{i-t_1}, X_{i-t_2}, ..., X_{i-t_K}) of a source symbol X_i into one of M conditioning states such that the conditional entropy H(X_i | Q(C_i)) is minimized. The resulting minimum conditional entropy context quantizer can be used for sequential coding of the sequence X_0, X_1, X_2, ....
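Wu, Chou, and Xue give efficient constructions for this quantizer; purely as an illustration of the objective H(X_i | Q(C_i)), a brute-force search over all M^K context-to-state maps (feasible only for tiny K; all names here are hypothetical) might look like:

```python
from itertools import product
from math import log2

def cond_entropy_after_merge(counts, assign, M):
    """Empirical H(X | Q(C)) in bits when context j is merged into
    conditioning state assign[j]; counts[j][s] is the number of times
    symbol s followed context j."""
    merged = [[0] * len(counts[0]) for _ in range(M)]
    for j, row in enumerate(counts):
        for s, c in enumerate(row):
            merged[assign[j]][s] += c
    n = sum(sum(r) for r in merged)
    h = 0.0
    for r in merged:
        t = sum(r)
        for c in r:
            if c:
                h -= (c / n) * log2(c / t)
    return h

def best_quantizer(counts, M):
    """Exhaustive search over all M^K context-to-state assignments."""
    K = len(counts)
    return min(product(range(M), repeat=K),
               key=lambda a: cond_entropy_after_merge(counts, a, M))

# Two contexts biased toward symbol 0 and two toward symbol 1:
counts = [[9, 1], [8, 2], [1, 9], [2, 8]]
assign = best_quantizer(counts, M=2)
```

Grouping contexts with similar conditional distributions of the next symbol is exactly what minimizes H(X | Q(C)), which is why the toy search pairs the like-biased contexts.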

Chart of the number of search results per year

Click on the chart to filter the results by publication year.