Search results for: entropy estimate

Number of results: 291306

Journal: Discrete Applied Mathematics, 2004
Vladimir N. Potapov

The problem of non-distorting compression (or coding) of sequences of symbols is considered. For sequences of asymptotically zero empirical entropy, a modification of the Lempel–Ziv coding rule is offered whose coding cost is at most a finite number of times worse than the optimum. A combinatorial proof is offered for the well-known redundancy estimate of the Lempel–Ziv coding algorithm for sequenc...
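The abstract does not spell out its modified rule, but the classical Lempel–Ziv idea it builds on is incremental parsing: split the input into the shortest phrases not seen before, and code each phrase as (previous-phrase index, new symbol). A minimal sketch of that standard parse (not the paper's modification):

```python
def lz78_parse(s):
    """LZ78 incremental parsing: split s into the shortest phrases
    not previously in the dictionary. Each phrase is coded as
    (index of its longest known prefix, the one new symbol); the
    phrase count drives the coding cost."""
    dictionary = {"": 0}   # phrase -> index; empty phrase has index 0
    phrases = []
    current = ""
    for ch in s:
        candidate = current + ch
        if candidate in dictionary:
            current = candidate          # keep extending the match
        else:
            phrases.append((dictionary[current], ch))
            dictionary[candidate] = len(dictionary)
            current = ""
    if current:                          # leftover phrase at end of input
        phrases.append((dictionary[current], ""))
    return phrases
```

For example, `lz78_parse("aaabba")` yields the four phrases `a`, `aa`, `b`, `ba` as `[(0, "a"), (1, "a"), (0, "b"), (3, "a")]`; low-entropy inputs produce few, long phrases and hence a short code.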

2016
Hideitsu Hino Shotaro Akaho Noboru Murata

A method for estimating Shannon differential entropy is proposed based on the second-order expansion of the probability mass around the inspection point with respect to the distance from the point. Polynomial regression with a Poisson error structure is utilized to estimate the values of the density function. The density estimates at each given data point are averaged to obtain entropy estimators. ...
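The regression step in the abstract is specific to that paper, but the surrounding plug-in idea is simple: estimate the density at each sample point, then average the negative log-densities. A rough 1-D sketch using a k-NN density estimate in place of the paper's regression (the function name, the choice of k, and the 1-D restriction are assumptions for illustration):

```python
import math

def knn_entropy_1d(samples, k=3):
    """Plug-in differential-entropy estimate for 1-D data: estimate
    the density at each sample from the distance to its k-th nearest
    neighbour, then average -log(density) over the samples.
    (A simplified stand-in for the regression-based density estimate
    described in the abstract; distinct sample values are assumed.)"""
    xs = sorted(samples)
    n = len(xs)
    total = 0.0
    for i, x in enumerate(xs):
        # distance to the k-th nearest of the other points
        dists = sorted(abs(x - y) for j, y in enumerate(xs) if j != i)
        r_k = dists[k - 1]
        # k points fall in the interval of radius r_k around x
        density = k / (n * 2.0 * r_k)
        total += -math.log(density)
    return total / n
```

On samples drawn roughly uniformly from [0, 1] the estimate should sit near the true differential entropy of 0, up to the well-known small-sample bias of such plug-in estimators.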

1996
Qin Zhang John M. Danskin

In this paper, we introduce a pattern matching algorithm used in document image compression. This pattern matching algorithm uses the cross entropy between two patterns as the criterion for a match. We use a physical model which is based on the finite resolution of the scanner (spatial sampling error) to estimate the probability values used in cross entropy calculation. Experimental results show ...
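The scanner model that produces the probabilities is the paper's contribution and is not reproduced in the snippet, but the match criterion itself reduces to a standard cross entropy between a probabilistic template and an observed bitmap. A minimal sketch, assuming flat pixel lists and probabilities already clipped away from 0 and 1:

```python
import math

def pattern_cross_entropy(template_probs, candidate_bits):
    """Cross entropy of an observed binary pattern against a template
    whose pixels carry foreground probabilities (e.g. derived from a
    model of scanner sampling error). Lower values indicate a better
    match. Both arguments are equal-length flat sequences."""
    h = 0.0
    for p, b in zip(template_probs, candidate_bits):
        # -log P(observed pixel | template pixel probability)
        h += -math.log(p) if b else -math.log(1.0 - p)
    return h
```

A candidate is matched to the stored template with the smallest cross entropy; a perfect match against near-certain template pixels yields a value near zero.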

2017
Valerio Lucarini

We extend the analysis of the thermodynamics of the climate system by investigating the role played by processes taking place at various spatial and temporal scales through a procedure of coarse graining. We show that the coarser is the graining of the climatic fields, the lower is the resulting estimate of the material entropy production. In other terms, all the spatial and tempo...

2008
F. Aurzada

We investigate the small deviation probabilities of a class of very smooth stationary Gaussian processes playing an important role in Bayesian statistical inference. Our calculations are based on the appropriate modification of the entropy method due to Kuelbs, Li, and Linde as well as on classical results about the entropy of classes of analytic functions. They also involve Tsirelson’s upper b...

1980
Jose A. Costa

We propose a new algorithm that simultaneously estimates the intrinsic dimension and intrinsic entropy of random data sets lying on smooth manifolds. The method is based on asymptotic properties of entropic graph constructions. In particular, we compute the Euclidean k-nearest neighbors (k-NN) graph over the sample points and use its overall total edge length to estimate intrinsic dimension and e...
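The statistic at the heart of this entropic-graph method is just the total edge length of the Euclidean k-NN graph; the dimension and entropy estimates then come from how that length scales with sample size. A brute-force sketch of the statistic itself (the function name and default k are assumptions; the paper's estimation step from the length is omitted):

```python
import math

def knn_total_edge_length(points, k=4):
    """Total edge length of the Euclidean k-NN graph over a list of
    coordinate tuples: for each point, sum the distances to its k
    nearest neighbours. In entropic-graph methods, the growth rate
    of this quantity with sample size reveals the intrinsic
    dimension and entropy of the underlying manifold."""
    total = 0.0
    for i, p in enumerate(points):
        dists = sorted(
            math.dist(p, q) for j, q in enumerate(points) if j != i
        )
        total += sum(dists[:k])   # edges from p to its k nearest points
    return total
```

For large problems one would replace the O(n²) scan with a spatial index (e.g. a k-d tree), but the statistic is unchanged.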

2017

Could scientists use the Second Law of Thermodynamics on your chewing muscles to work out when you are going to die? According to research published in the International Journal of Exergy, the level of entropy, or thermodynamic disorder, in the chewing muscles in your jaw increases with each mouthful. This entropy begins to accumulate from the moment you're "on solids" until your last meal, but...

Journal: Physical Review E, Statistical, Nonlinear, and Soft Matter Physics, 2010
György Szabó Tânia Tomé István Borsos

The structure of probability currents is studied for the dynamical network after consecutive contraction on two-state, nonequilibrium lattice systems. This procedure allows us to investigate the transition rates between configurations on small clusters and highlights some relevant effects of lattice symmetries on the elementary transitions that are responsible for entropy production. A method i...

2004
Jian-Wu Xu Deniz Erdogmus Yadunandana N. Rao José Carlos Príncipe

Recently, the authors developed the Minimax Mutual Information algorithm for linear ICA of real-valued mixtures, which is based on a density estimate stemming from Jaynes’ maximum entropy principle. Since the entropy estimates result in an approximate upper bound for the actual mutual information of the separated outputs, minimizing this upper bound results in a robust performance and good gene...

2006
Lei Ni Peter Li Richard Hamilton

In this paper we prove a new matrix Li-Yau-Hamilton (LYH) estimate for Kähler-Ricci flow on manifolds with nonnegative bisectional curvature. The form of this new LYH estimate is obtained by the interpolation consideration originated in [Ch] by Chow. This new inequality is shown to be connected with Perelman’s entropy formula through a family of differential equalities. In the rest of the paper...
