Search results for: entropy estimate

Number of results: 291,306

Journal: J. Sci. Comput., 2002
Gabriella Puppo

In this work, we describe the behaviour of the numerical cell entropy production for several schemes. The numerical results we show indicate that numerical entropy production can be used to estimate the local error in regions of smoothness and to locate shocks. Thus the numerical entropy production can be computed at each time step to monitor the numerical solution produced by a scheme. The inf...
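The monitoring idea above can be made concrete with a minimal sketch (not the paper's schemes or normalization): Burgers' equation with the Lax-Friedrichs scheme, the entropy pair eta(u) = u^2/2, psi(u) = u^3/3, and a Lax-Friedrichs-type numerical entropy flux. The cell residual stays near zero in smooth regions and turns strongly negative at the shock, which is what makes it usable as both a local error indicator and a shock locator.

```python
import numpy as np

def lax_friedrichs_step(u, dx, dt):
    """One Lax-Friedrichs step for Burgers' equation u_t + (u^2/2)_x = 0 (periodic)."""
    up, um = np.roll(u, -1), np.roll(u, 1)
    return 0.5 * (up + um) - dt / (2 * dx) * (0.5 * up**2 - 0.5 * um**2)

def entropy_production(u_old, u_new, dx, dt):
    """Cell residual S_j = (eta^{n+1} - eta^n)/dt + (Psi_{j+1/2} - Psi_{j-1/2})/dx."""
    eta = lambda u: 0.5 * u**2
    psi = lambda u: u**3 / 3.0
    up = np.roll(u_old, -1)
    # Lax-Friedrichs-type numerical entropy flux at interfaces j+1/2
    Psi = 0.5 * (psi(u_old) + psi(up)) - dx / (2 * dt) * (eta(up) - eta(u_old))
    return (eta(u_new) - eta(u_old)) / dt + (Psi - np.roll(Psi, 1)) / dx

# Riemann problem: shock between u = 1 (left) and u = 0 (right), speed 1/2.
N = 200
dx = 1.0 / N
x = (np.arange(N) + 0.5) * dx
u = np.where(x < 0.5, 1.0, 0.0)
dt = 0.4 * dx  # CFL with max |f'(u)| = 1
for _ in range(20):
    u_new = lax_friedrichs_step(u, dx, dt)
    S = entropy_production(u, u_new, dx, dt)  # residual of the last step
    u = u_new
```

In constant regions the residual vanishes identically, while the cells straddling the shock carry essentially all of the entropy dissipation, so thresholding |S| locates the shock.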

Journal: Physical Review E: Statistical, Nonlinear, and Soft Matter Physics, 2012
Martin Vinck, Francesco P. Battaglia, Vladimir B. Balakirsky, A. J. Han Vinck, Cyriel M. A. Pennartz

Estimating entropy from empirical samples of finite size is of central importance for information theory as well as the analysis of complex statistical systems. Yet, this delicate task is marred by intrinsic statistical bias. Here we decompose the entropy function into a polynomial approximation function and a remainder function. The approximation function is based on a Taylor expansion of the ...
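The bias problem described above is easy to demonstrate with a minimal sketch (this is not the authors' polynomial/remainder decomposition): the naive "plug-in" entropy of empirical frequencies is biased downward, and the classical Miller-Madow term (K-1)/(2N) corrects the leading O(1/N) part of that bias.

```python
import numpy as np

def plugin_entropy(counts):
    """Naive plug-in Shannon entropy (nats) from symbol counts."""
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def miller_madow_entropy(counts):
    """Plug-in entropy plus the first-order bias correction (K-1)/(2N)."""
    k = np.count_nonzero(counts)  # number of observed symbols
    n = counts.sum()
    return plugin_entropy(counts) + (k - 1) / (2 * n)

rng = np.random.default_rng(0)
true_h = np.log(20)                              # uniform over 20 symbols
samples = rng.choice(20, size=100)               # 100 draws, uniform
counts = np.bincount(samples, minlength=20)
h_plug = plugin_entropy(counts)                  # biased low
h_mm = miller_madow_entropy(counts)              # partially corrected
```

With only 100 samples over 20 symbols the plug-in estimate falls noticeably short of log 20; the correction recovers most of the gap, which is exactly the small-sample regime the abstract is concerned with.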

1995
Paul A. Viola, Nicol N. Schraudolph, Terrence J. Sejnowski

No finite sample is sufficient to determine the density, and therefore the entropy, of a signal directly. Some assumption about either the functional form of the density or about its smoothness is necessary. Both amount to a prior over the space of possible density functions. By far the most common approach is to assume that the density has a parametric form. By contrast we derive a differentia...
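The smoothness-prior alternative to a parametric fit can be sketched in the spirit of Parzen-window approaches (the details below are illustrative, not the authors' derivation): estimate differential entropy as the negative mean log of a leave-one-out Gaussian kernel density evaluated at the samples themselves.

```python
import numpy as np

def loo_kde_entropy(x, bandwidth):
    """H ~ -(1/N) sum_i log p_hat(x_i), with p_hat a leave-one-out Gaussian KDE."""
    n = len(x)
    d2 = (x[:, None] - x[None, :]) ** 2
    k = np.exp(-d2 / (2 * bandwidth**2)) / (np.sqrt(2 * np.pi) * bandwidth)
    np.fill_diagonal(k, 0.0)             # leave each sample out of its own estimate
    p_hat = k.sum(axis=1) / (n - 1)
    return float(-np.mean(np.log(p_hat)))

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, size=2000)
h_est = loo_kde_entropy(x, bandwidth=0.3)
h_true = 0.5 * np.log(2 * np.pi * np.e)  # differential entropy of a unit Gaussian
```

The only assumption is the kernel bandwidth, i.e. a smoothness prior, rather than a parametric family; for a unit Gaussian the estimate lands close to the analytic value 0.5 log(2*pi*e).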

2010
Yudai Honma, Osamu Kurita, Azuma Taguchi

In this paper, we propose a new spatial interaction model for trip-chaining behavior that consists of a sequence of movements. In particular, by incorporating origin-destination constraints, we generalize the traditional entropy-maximizing model to deal with trip-chaining behaviors. Traditional entropy models are noteworthy for their theoretical derivation of the gravity model and its...
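For the single-trip case that this work generalizes, the entropy-maximizing model with fixed origin totals O_i and destination totals D_j reduces to the doubly constrained gravity model T_ij = A_i O_i B_j D_j exp(-beta c_ij), solved by iterative balancing. A minimal sketch with made-up costs (not the paper's trip-chaining model):

```python
import numpy as np

def doubly_constrained(O, D, cost, beta, iters=100):
    """Balance T_ij = A_i O_i B_j D_j exp(-beta c_ij) to match row/column totals."""
    F = np.exp(-beta * cost)                 # deterrence function
    A = np.ones_like(O, dtype=float)
    B = np.ones_like(D, dtype=float)
    for _ in range(iters):                   # alternate the balancing factors
        A = 1.0 / (F @ (B * D))
        B = 1.0 / (F.T @ (A * O))
    return (A * O)[:, None] * (B * D)[None, :] * F

# Hypothetical 2x2 example: totals must agree (sum O == sum D == 300).
O = np.array([100.0, 200.0])                 # trips produced at each origin
D = np.array([150.0, 150.0])                 # trips attracted to each destination
cost = np.array([[1.0, 2.0],
                 [2.0, 1.0]])                # travel costs c_ij
T = doubly_constrained(O, D, cost, beta=0.5)
```

The iteration converges geometrically, after which the trip matrix reproduces both marginal totals while distributing flow according to the cost-deterrence weights.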

Journal: Entropy, 2018
Michal Munk, Lubomír Benko

The paper examines the use of entropy in the field of web usage mining. Entropy offers an alternative way of determining the ratio of auxiliary pages during session identification with the Reference Length method. The experiment was conducted on two different web portals. The first log file was obtained from a course on a virtual learning environment web portal. T...

2016
Young-Seok Choi

This paper presents a multiscale information measure of the electroencephalogram (EEG) for the analysis of short data lengths. A multiscale extension of permutation entropy (MPE) can fully reflect the dynamical characteristics of EEG across different temporal scales. However, MPE yields imprecise estimates due to the coarse-graining procedure at large scales. We present an improved MPE m...
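The baseline that the paper improves on can be sketched as follows (standard MPE, not the authors' refined variant): coarse-grain the series at scale s, then compute the Shannon entropy of the ordinal patterns of order m.

```python
import numpy as np
from math import factorial

def coarse_grain(x, scale):
    """Average non-overlapping windows of length `scale`."""
    n = len(x) // scale
    return x[: n * scale].reshape(n, scale).mean(axis=1)

def permutation_entropy(x, m):
    """Normalized permutation entropy of order m (near 0 = regular, near 1 = random)."""
    patterns = {}
    for i in range(len(x) - m + 1):
        key = tuple(np.argsort(x[i : i + m]))   # ordinal pattern of the window
        patterns[key] = patterns.get(key, 0) + 1
    p = np.array(list(patterns.values()), dtype=float)
    p /= p.sum()
    return float(-np.sum(p * np.log(p)) / np.log(factorial(m)))

rng = np.random.default_rng(2)
noise = rng.normal(size=3000)                       # irregular signal
regular = np.sin(np.linspace(0, 60 * np.pi, 3000))  # regular signal
pe_noise = permutation_entropy(coarse_grain(noise, 2), m=3)
pe_regular = permutation_entropy(coarse_grain(regular, 2), m=3)
```

White noise produces near-uniform ordinal patterns (entropy near 1), while a sinusoid is dominated by monotone patterns (much lower entropy); the imprecision the paper targets comes from the shrinking number of coarse-grained samples at large scales.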

1996
Paul Viola, Nicol N. Schraudolph, Terrence J. Sejnowski

No finite sample is sufficient to determine the density, and therefore the entropy, of a signal directly. Some assumption about either the functional form of the density or about its smoothness is necessary. Both amount to a prior over the space of possible density functions. By far the most common approach is to assume that the density has a parametric form. By contrast we derive a differential lea...

2000
A. van de Walle, G. Ceder

Experimental as well as theoretical work indicates that the relative stability of the ordered and the disordered states of a compound may be significantly affected by their difference in vibrational entropy. The origin of this difference is usually attributed to the fact that disordering reduces the number of stiff bonds between different atomic species in favor of soft bonds between identical ...

Journal: Entropy, 2013
Maria Teresa Giraudo, Laura Sacerdote, Roberta Sirovich

A new, non-parametric and binless estimator for the mutual information of a d-dimensional random vector is proposed. First of all, an equation that links the mutual information to the entropy of a suitable random vector with uniformly distributed components is deduced. When d = 2 this equation reduces to the well-known connection between mutual information and entropy of the copula function ass...
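The copula connection for d = 2 can be sketched with a simple plug-in version (illustrative only, not the paper's binless estimator): rank-transform each margin to uniform, so the mutual information of (X, Y) equals the negative differential entropy of the resulting copula sample, here estimated with a histogram.

```python
import numpy as np

def copula_mi(x, y, bins=10):
    """MI estimate (nats) from a histogram of the empirical copula of (x, y)."""
    n = len(x)
    u = np.argsort(np.argsort(x)) / n      # ranks -> approximately Uniform(0, 1)
    v = np.argsort(np.argsort(y)) / n
    h2d, _, _ = np.histogram2d(u, v, bins=bins, range=[[0, 1], [0, 1]])
    p = h2d / n
    p = p[p > 0]
    # With uniform margins each marginal bin carries mass 1/bins, so
    # MI = sum_ij p_ij log(p_ij * bins^2).
    return float(np.sum(p * np.log(p * bins**2)))

rng = np.random.default_rng(3)
n = 5000
x = rng.normal(size=n)
y_indep = rng.normal(size=n)           # independent of x
y_dep = x + 0.5 * rng.normal(size=n)   # strongly dependent on x
mi_indep = copula_mi(x, y_indep)
mi_dep = copula_mi(x, y_dep)
```

The rank transform makes the estimate invariant to monotone rescalings of either variable, which is exactly what the copula formulation buys; the histogram step is the crude part that a binless estimator avoids.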
