Search results for: approximate entropy

Number of results: 138522

Journal: Discrete Mathematics & Theoretical Computer Science, 2001
Pascal Koiran

We show that it is impossible to compute (or even to approximate) the topological entropy of a continuous piecewise affine function in dimension four. The same result holds for saturated linear functions in unbounded dimension. We ask whether the topological entropy of a piecewise affine function is always a computable real number, and conversely whether every non-negative computable real numbe...

Journal: Oper. Res. Lett., 2008
Jean Cardinal, Samuel Fiorini, Gwenaël Joret

We study graph orientations that minimize the entropy of the in-degree sequence. The problem of finding such an orientation is an interesting special case of the minimum entropy set cover problem previously studied by Halperin and Karp (Theoret. Comput. Sci., 2005) and by the current authors (Algorithmica, to appear). We prove that the minimum entropy orientation problem is NP-hard even if the ...
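As a point of reference, here is a minimal sketch of the objective in question, assuming the common convention that the in-degrees of an orientation are normalised by the number of edges so that they form a probability distribution whose Shannon entropy is then minimised; the function name and this exact normalisation are illustrative rather than quoted from the paper.

```python
import math

def indegree_entropy(n_vertices, oriented_edges):
    """Entropy (in nats) of the in-degree sequence of a graph orientation.

    `oriented_edges` is a list of (u, v) pairs, meaning the edge is oriented
    from u to v.  In-degrees are divided by the total number of edges so they
    sum to one, and the Shannon entropy of that distribution is returned.
    """
    indeg = [0] * n_vertices
    for _, v in oriented_edges:
        indeg[v] += 1
    m = len(oriented_edges)
    return -sum((d / m) * math.log(d / m) for d in indeg if d > 0)

# A path on 3 vertices oriented 0 -> 1 -> 2 has in-degrees (0, 1, 1),
# so the entropy is log 2 ≈ 0.693.
print(indegree_entropy(3, [(0, 1), (1, 2)]))
```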

2008
Aziz Madrane, Eitan Tadmor

One reason that makes Roe’s Riemann solver attractive is its low computational cost. The main drawback of Roe’s approximate Riemann solver, however, is that non-physical expansion shocks can occur at sonic points; it was remarked early on that, in this particular situation, the Roe flux does not satisfy the entropy condition. In this paper an elegant response has been proposed by combining H...
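To illustrate the issue, the sketch below implements Roe's flux for a scalar conservation law together with a Harten-type entropy fix that keeps the numerical dissipation positive near sonic points. This is the generic textbook remedy, not necessarily the combination the paper proposes, and the parameter `delta` is an illustrative choice.

```python
def roe_flux_entropy_fix(uL, uR, f, delta=0.1):
    """Roe's approximate Riemann flux for a scalar law u_t + f(u)_x = 0,
    with a Harten-type entropy fix near sonic points.

    The Roe speed `a` is the secant slope of f between the two states.  The
    plain Roe flux uses |a| as its dissipation coefficient, which vanishes at
    sonic points and can admit non-physical expansion shocks; the fix replaces
    |a| by a smooth positive surrogate whenever |a| < delta.
    """
    a = 0.0 if uL == uR else (f(uR) - f(uL)) / (uR - uL)
    abs_a = abs(a)
    if abs_a < delta:  # entropy fix: keep dissipation bounded away from zero
        abs_a = 0.5 * (a * a / delta + delta)
    return 0.5 * (f(uL) + f(uR)) - 0.5 * abs_a * (uR - uL)

# Burgers' flux f(u) = u^2/2 with transonic rarefaction data uL < 0 < uR,
# exactly the case where the unmodified Roe flux violates the entropy condition.
print(roe_flux_entropy_fix(-1.0, 1.0, lambda u: 0.5 * u * u))
```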

2011
Wen Shen, Tianyou Zhang

In this paper we review some results on a model for the erosion of a mountain profile caused by small avalanches. The equation is a scalar conservation law with a non-local flux. Under suitable assumptions on the erosion rate, the mountain profile develops three types of singularities, namely kinks, shocks and hyper-kinks. Entropy weak solutions to the Cauchy problem can be constructed globally i...

2009
Zoltán Szabó, András Lörincz

• Goal: estimation of high-dimensional information-theoretic quantities (entropy, mutual information, divergence).
• Problem: computation/estimation is quite slow.
• Consistent estimation is possible by nearest neighbor (NN) methods [1] → pairwise distances of sample points:
  – expensive in high dimensions [2],
  – approximate isometric embedding into low dimension is possible (Johnson-Lindenstrau...
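A minimal sketch of the two ingredients mentioned above, assuming a standard Kozachenko-Leonenko style k-NN entropy estimator and a Gaussian random projection as the Johnson-Lindenstrauss embedding; the function names and constants are illustrative and are not taken from [1] or [2].

```python
import numpy as np
from scipy.special import digamma, gammaln

def knn_entropy(x, k=1):
    """Kozachenko-Leonenko style k-NN estimate of differential entropy (nats).

    For each sample, the distance to its k-th nearest neighbour is found by a
    brute-force O(N^2) pairwise-distance computation (the expensive step in
    high dimensions) and plugged into
        H ~ psi(N) - psi(k) + log(c_d) + (d/N) * sum_i log(eps_i),
    where c_d is the volume of the d-dimensional unit ball.
    """
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    dists = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    np.fill_diagonal(dists, np.inf)
    eps = np.sort(dists, axis=1)[:, k - 1]          # distance to k-th neighbour
    log_cd = 0.5 * d * np.log(np.pi) - gammaln(0.5 * d + 1.0)
    return digamma(n) - digamma(k) + log_cd + d * np.mean(np.log(eps))

def jl_project(x, target_dim, seed=0):
    """Gaussian random projection in the spirit of Johnson-Lindenstrauss."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    proj = rng.standard_normal((x.shape[1], target_dim)) / np.sqrt(target_dim)
    return x @ proj

# Sanity check: a 1D standard normal has entropy 0.5*log(2*pi*e) ~ 1.419.
rng = np.random.default_rng(1)
print(knn_entropy(rng.standard_normal((2000, 1))))
```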

2004
Jian-Wu Xu, Deniz Erdogmus, Yadunandana N. Rao, José Carlos Príncipe

Recently, the authors developed the Minimax Mutual Information algorithm for linear ICA of real-valued mixtures, which is based on a density estimate stemming from Jaynes’ maximum entropy principle. Since the entropy estimates result in an approximate upper bound for the actual mutual information of the separated outputs, minimizing this upper bound results in a robust performance and good gene...

Journal: J. Comput. Physics, 2016
Claudio Bierig, Alexey Chernov

We develop a complete convergence theory for the Maximum Entropy method based on moment matching for a sequence of approximate statistical moments estimated by the Multilevel Monte Carlo method. Under appropriate regularity assumptions on the target probability density function, the proposed method is superior to the Maximum Entropy method with moments estimated by the Monte Carlo method. New t...
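As a rough sketch of moment-matching maximum entropy (without the multilevel construction): fit a density p(x) ∝ exp(Σ_k λ_k x^k) on a bounded interval by minimising the convex dual of the entropy problem, with the target moments supplied externally, e.g. from a Monte Carlo or multilevel Monte Carlo estimator. The quadrature grid, optimiser, and function name are illustrative assumptions, not the paper's construction.

```python
import numpy as np
from scipy.optimize import minimize

def maxent_density(moments, support=(-1.0, 1.0), n_grid=401):
    """Maximum-entropy density with prescribed power moments on an interval.

    Finds multipliers lam so that p(x) ~ exp(sum_k lam[k] * x**(k+1)) matches
    the given moments, by minimising the convex dual
        F(lam) = log Z(lam) - lam . m,
    with the partition function Z evaluated by quadrature on a uniform grid.
    """
    a, b = support
    x = np.linspace(a, b, n_grid)
    dx = (b - a) / (n_grid - 1)
    phi = np.vstack([x ** (k + 1) for k in range(len(moments))])  # x, x^2, ...
    m = np.asarray(moments, dtype=float)

    def dual(lam):
        logits = lam @ phi
        c = logits.max()                               # numerical stability
        log_z = c + np.log(np.sum(np.exp(logits - c)) * dx)
        return log_z - lam @ m

    lam = minimize(dual, np.zeros(len(m)), method="Nelder-Mead").x
    logits = lam @ phi
    p = np.exp(logits - logits.max())
    p /= np.sum(p) * dx                                # normalise the density
    return x, p

# Match a mean of 0.0 and a second moment of 0.2 on [-1, 1].
x, p = maxent_density([0.0, 0.2])
dx = x[1] - x[0]
print(np.sum(x * p) * dx, np.sum(x ** 2 * p) * dx)     # ~ 0.0 and ~ 0.2
```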

2016
A. Rashad, M. Mahmoud, M. Yusuf

In this paper we develop approximate Bayes estimators of the two-parameter logistic distribution. Lindley’s approximation and importance sampling techniques are applied. The Gaussian-gamma prior distribution and progressively type-II censored samples are assumed. Quadratic, LINEX and general entropy loss functions are used. The statistical performances of the Bayes estimates under quadratic, l...
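For reference, the closed-form Bayes estimators under the three losses mentioned, applied to (possibly importance-weighted) posterior draws of a positive parameter. The loss parameters c and q and the toy posterior below are illustrative, and this sketch does not reproduce the paper's Lindley approximation or the progressively type-II censored likelihood.

```python
import numpy as np

def bayes_estimates(theta, weights=None, c=1.0, q=1.0):
    """Bayes estimates of a positive parameter under three standard losses,
    given (possibly importance-weighted) posterior draws `theta`:

      quadratic (squared error): E[theta]
      LINEX with shape c:        -(1/c) * log E[exp(-c * theta)]
      general entropy with q:    (E[theta**(-q)]) ** (-1/q)
    """
    theta = np.asarray(theta, dtype=float)
    w = np.ones_like(theta) if weights is None else np.asarray(weights, float)
    w = w / w.sum()                                   # normalise the weights
    quadratic = np.sum(w * theta)
    linex = -np.log(np.sum(w * np.exp(-c * theta))) / c
    gen_entropy = np.sum(w * theta ** (-q)) ** (-1.0 / q)
    return quadratic, linex, gen_entropy

# Toy posterior draws for a scale parameter (purely illustrative):
rng = np.random.default_rng(0)
draws = rng.gamma(shape=5.0, scale=0.4, size=5000)
print(bayes_estimates(draws, c=0.5, q=1.0))
```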

2009
R. E. ULANOWICZ

Classically, the increase of entropy implies an ineluctable dissipation of energy and materials into what is known as ‘heat death’. A strictly logical take on the Boltzmann entropy reveals, however, that the measure amalgamates order with disorganization. Hence, under some nonequilibrium circumstances, the production of order becomes an inevitable feature of increasing entropy. In particular, p...

2002
Dekai Wu

Statistical approaches to natural language parsing and interpretation have a number of advantages but thus far have failed to incorporate compositional generalizations found in traditional structural models. A major reason for this is the inability of most statistical language models in use to represent relational constraints, the connectionist variable binding problem being a prominent cas...
