Search results for: entropy estimate

Number of results: 291306

Journal: J. Nonlinear Science, 2010
Lucio M. Calcagnile, Stefano Galatolo, Giulia Menconi

We numerically test the method of non-sequential recursive pair substitutions to estimate the entropy of an ergodic source. We compare its performance with other classical methods of entropy estimation (empirical frequencies, return times, Lyapunov exponent). As benchmarks for the methods, we consider several systems with different statistical properties: renewal processes, dynamical system...
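The first of the classical baselines mentioned above, estimation from empirical frequencies, can be sketched as a plug-in estimator of the block-entropy rate. This is a minimal illustration of the general technique, not the benchmark code used in the paper; the function name and the block length `k` are illustrative choices:

```python
from collections import Counter
from math import log
import random

def empirical_entropy(seq, k=1):
    """Plug-in estimate of the entropy rate from empirical k-block frequencies.

    Counts all length-k blocks in the sequence, forms empirical
    probabilities, and returns the block entropy divided by k (in nats).
    """
    blocks = [tuple(seq[i:i + k]) for i in range(len(seq) - k + 1)]
    counts = Counter(blocks)
    n = len(blocks)
    h = -sum((c / n) * log(c / n) for c in counts.values())
    return h / k

# Sanity check: a fair-coin Bernoulli source has entropy rate log 2 per symbol.
random.seed(0)
seq = [random.randint(0, 1) for _ in range(100_000)]
print(empirical_entropy(seq, k=3))  # close to log(2) ≈ 0.693
```

For sources with long-range correlations (such as the renewal processes used as benchmarks), the plug-in estimate converges slowly in the block length `k`, which is precisely the regime where alternative methods become attractive.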

Journal: SIAM J. Numerical Analysis, 2014
Laurent Monasse, Régis Monneau

In this paper, we consider diagonal hyperbolic systems with monotone continuous initial data. We propose a natural semi-explicit and upwind first order scheme. Under a certain non-negativity condition on the Jacobian matrix of the velocities of the system, there is a gradient entropy estimate for the hyperbolic system. We show that our scheme enjoys a similar gradient entropy estimate at the di...

2008
Gavin Band

The topological entropy of a braid is the infimum of the entropies of all homeomorphisms of the disk which have a finite invariant set represented by the braid. When the isotopy class represented by the braid is pseudo-Anosov or is reducible with a pseudo-Anosov component, this entropy is positive. Fried and Kolev proved that the entropy is bounded below by the logarithm of the spectral radius o...

2005
E. A. Carlen

We prove a lower bound on the rate of relaxation to equilibrium in the H1 norm for a thin film equation. We find a two stage relaxation, with power law decay in an initial interval, followed by exponential decay, at an essentially optimal rate, for large times. The waiting time until the exponential decay sets in is explicitly estimated.

2006
F. C. Chittaro

In the paper we present a generalization to Hamiltonian flows on symplectic manifolds of the estimate proved by Ballmann and Wojtkovski in [4] for the dynamical entropy of the geodesic flow on a compact Riemannian manifold of nonpositive sectional curvature. Given such a Riemannian manifold M, Ballmann and Wojtkovski proved that the dynamical entropy hμ of the geodesic flow on M satisfies the f...

Journal: CoRR, 2008
Ping Li

Compressed Counting (CC) was recently proposed for approximating the αth frequency moments of data streams, for 0 < α ≤ 2. Under the relaxed strict-Turnstile model, CC dramatically improves the standard algorithm based on symmetric stable random projections, especially as α → 1. A direct application of CC is to estimate the entropy, which is an important summary statistic in Web/network measure...

Journal: Entropy, 2008
Julian Sorensen

At the heart of many ICA techniques is a nonparametric estimate of an information measure, usually via nonparametric density estimation, for example, kernel density estimation. While not as popular as kernel density estimators, orthogonal functions can be used for nonparametric density estimation (via a truncated series expansion whose coefficients are calculated from the observed data). While ...
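The truncated-series idea can be illustrated with a cosine basis on [0, 1]: each series coefficient is a sample mean of the corresponding basis function evaluated at the data, and the resulting density estimate can be plugged into the entropy integral. This is a hedged sketch of the general orthogonal-series approach; the function names, the choice of basis, and the truncation level `J` are illustrative assumptions, not details from the paper:

```python
import numpy as np

def cosine_series_density(x, J=5, grid_size=512):
    """Orthogonal-series density estimate on [0, 1].

    Uses the cosine basis {1, sqrt(2) cos(j*pi*t)}; each coefficient b_j
    is the sample mean of the corresponding basis function, as in a
    truncated series expansion fitted from the observed data.
    """
    grid = np.linspace(0.0, 1.0, grid_size)
    f = np.ones_like(grid)
    for j in range(1, J + 1):
        b_j = np.mean(np.sqrt(2.0) * np.cos(j * np.pi * x))
        f += b_j * np.sqrt(2.0) * np.cos(j * np.pi * grid)
    return grid, np.clip(f, 1e-12, None)  # clip so log(f) is defined

def plugin_entropy(x, J=5):
    """Differential entropy -∫ f log f dt of the estimated density."""
    grid, f = cosine_series_density(np.asarray(x), J)
    y = f * np.log(f)
    dx = grid[1] - grid[0]
    return -dx * (y.sum() - 0.5 * (y[0] + y[-1]))  # trapezoid rule
```

For uniform data the estimated coefficients are near zero, the density estimate is near 1, and the plug-in entropy is near 0, which makes a convenient sanity check for the sketch.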

2008
Ximing Wu, Jeffrey M. Perloff, Amos Golan, Yuichi Kitamura

We develop a GMM estimator for the distribution of a variable where summary statistics are available only for intervals of the random variable. Without individual data, one cannot calculate the weighting matrix for the GMM estimator. Instead, we propose a simulated weighting matrix based on a first-step consistent estimate. When the functional form of the underlying distribution is unknown, we...

Journal: CoRR, 2017
Yan Zhang, Mete Ozay, Zhun Sun, Takayuki Okatani

In this paper, we suggest a framework to make use of mutual information as a regularization criterion to train Auto-Encoders (AEs). In the proposed framework, AEs are regularized by minimization of the mutual information between input and encoding variables of AEs during the training phase. In order to estimate the entropy of the encoding variables and the mutual information, we propose a non-p...

Journal: Computational Statistics & Data Analysis, 2015
Hideitsu Hino, Kensuke Koshijima, Noboru Murata

Estimators for differential entropy are proposed. The estimators are based on the second order expansion of the probability mass around the inspection point with respect to the distance from the point. Simple linear regression is utilized to estimate the values of density function and its second derivative at a point. After estimating the values of the probability density function at each of th...
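The details of the paper's local-regression construction are truncated above; as a point of comparison, the classical nearest-neighbour differential entropy estimator of Kozachenko and Leonenko, specialised to one dimension, looks like the following. This is a sketch of the standard baseline, not the estimator proposed in the paper, and the digamma approximation is a stated simplification:

```python
import math
import random

def knn_entropy_1d(x, k=1):
    """Kozachenko–Leonenko k-NN differential entropy estimate in 1D (nats)."""
    n = len(x)
    xs = sorted(x)
    # Distance from each point to its k-th nearest neighbour; in sorted
    # 1D data the k nearest neighbours of xs[i] lie within xs[i-k..i+k].
    eps = []
    for i in range(n):
        lo, hi = max(0, i - k), min(n - 1, i + k)
        dists = sorted(abs(xs[j] - xs[i]) for j in range(lo, hi + 1) if j != i)
        eps.append(dists[k - 1])
    # Digamma via psi(m) ≈ log m - 1/(2m) for large m; psi(1) = -gamma exactly.
    gamma = 0.5772156649015329
    def psi(m):
        return -gamma if m == 1 else math.log(m) - 1.0 / (2 * m)
    # V_1 = 2 is the volume of the unit ball in one dimension.
    return psi(n) - psi(k) + math.log(2.0) + sum(math.log(e) for e in eps) / n

# Sanity check against the standard normal, whose differential entropy
# is 0.5 * log(2πe) ≈ 1.4189 nats.
random.seed(1)
sample = [random.gauss(0.0, 1.0) for _ in range(5000)]
print(knn_entropy_1d(sample))
```

Like the density-based estimators described in the abstract, this baseline is consistent for smooth densities but sensitive to ties and boundary effects, which motivates the local second-order expansion the authors pursue.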
