Search results for: entropy measure
Number of results: 404955
Recently, Lewis Bowen introduced a notion of entropy for measure-preserving actions of a countable sofic group on a standard probability space admitting a generating partition with finite entropy. By applying an operator-algebra perspective, we develop a more general approach to sofic entropy that produces both measure-theoretic and topological dynamical invariants. We establish the variational principle ...
Entropy methods (approximate and sample entropy) have been studied to measure the complexity or predictability of finite-length time series. Identifying the parameters of this entropy family is an indispensable task for measuring the predictability of time-series data. So far, there have been no general rules for selecting these parameters; they rather depend on the particular problem. In th...
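As a rough illustration of the parameters involved (the embedding dimension m and the tolerance r), here is a minimal NumPy sketch of sample entropy; the function name, the r ≈ 0.2·std(x) default, and the simplified template counting are illustrative assumptions, not the procedure of the paper above.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Simplified sample entropy: -ln(A/B), where B counts template pairs of
    length m within tolerance r and A counts pairs of length m + 1."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()  # common heuristic; an assumption, not a universal rule

    def count_matches(dim):
        # Overlapping template vectors of length `dim`.
        templates = np.array([x[i:i + dim] for i in range(len(x) - dim)])
        count = 0
        for i in range(len(templates) - 1):
            # Chebyshev distance to later templates only (no self-matches).
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(dist <= r))
        return count

    b = count_matches(m)       # matches of length m
    a = count_matches(m + 1)   # matches of length m + 1
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

# Regular signals score low (predictable), noise scores high (unpredictable).
rng = np.random.default_rng(0)
print(sample_entropy(np.sin(np.linspace(0, 20, 500))))
print(sample_entropy(rng.standard_normal(500)))
```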
Cross-entropy is a measure of the difference between two distribution functions. To deal with the divergence of uncertain variables via uncertainty distributions, this paper introduces the concept of cross-entropy for uncertain variables based on uncertainty theory and investigates some mathematical properties of this concept. Several practical examples are also provided...
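For orientation, the classical cross-entropy that this uncertain-variable notion parallels is, for discrete distributions P = (p_1, ..., p_n) and Q = (q_1, ..., q_n),

$$ H(P, Q) = -\sum_{i=1}^{n} p_i \log q_i, $$

which reduces to the Shannon entropy of P when Q = P. The abstract above replaces probability distributions with uncertainty distributions; the formula here is only the standard probabilistic definition, not the paper's.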
The complexity of a Boolean function can be expressed in terms of computational work. We present experimental data in support of the entropy definition of computational work based upon the input-output description of a Boolean function. Our data show a linear relationship between the computational work and the average number of literals in a multi-level implementation. The investigation includes...
Jeff W.T. Kan and John S. Gero, Krasnow Institute for Advanced Study, USA. This paper presents a case study examining Shannon's entropy as a tool for measuring design creativity from protocols. Two design sessions were analyzed; one was judged to be more creative based on its outcomes and qualitative analyses. We calculated the text entropy ...
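A minimal sketch of what text entropy over word frequencies can look like in Python; the whitespace tokenization and the toy sentences are assumptions for illustration and do not reproduce the protocol segmentation used in the study.

```python
from collections import Counter
from math import log2

def text_entropy(text: str) -> float:
    """Shannon entropy H = -sum p_i * log2(p_i) over word frequencies."""
    counts = Counter(text.lower().split())
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A more varied vocabulary yields a higher entropy value.
print(text_entropy("move the wall move the wall again"))
print(text_entropy("rotate the roof then split the plan"))
```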
In fuzzy set theory, information measures play a paramount role in several areas such as decision making and pattern recognition. In this paper, a similarity measure based on the cosine function and entropy measures based on the logarithmic function are proposed for IFSs. Comparisons of the proposed similarity and entropy measures with existing ones are listed. Numerical results clearly show th...
If A is a variable of any kind, the entropy H(A) is a measure of its variety. It shows how much the various appearances of A differ from each other, whatever the kind and nature of these appearances. For a quantifiable variable, entropy is just another measure of variance, but entropy can also be used as a measure of variety for qualitative variables. If A and B are two variables, the de...
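Concretely, for a qualitative variable A taking value a with probability p(a), the Shannon entropy is

$$ H(A) = -\sum_{a} p(a) \log p(a), $$

which is zero when A always shows the same appearance and maximal (the logarithm of the number of categories) when all appearances are equally frequent. This is what makes it usable as a measure of variety rather than of numeric spread.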
Abstract. We define a classical probability analog of Voiculescu's free entropy dimension, which we call the classical probability entropy dimension. We show that the classical probability entropy dimension is related to several other notions of dimension. First, it equals the fractal dimension. Second, if one extends Bochner's inequalities to a measure by requiring that microstates aroun...
First, this paper recalls a recently introduced method of adaptive monitoring of dynamical systems and presents its most recent extension with a multiscale-enhanced approach. It is then shown that this concept of real-time data monitoring establishes a novel non-Shannon, non-probabilistic concept of novelty quantification, i.e., the Entropy of Learning, or Learning Entropy for short. This no...