Search results for: non entropy

Number of results: 1,374,176

2010
Christos Arvanitis Charalambos Makridakis Nikolaos I. Sfakianakis

We consider numerical schemes which combine non-uniform, adaptively redefined spatial meshes with entropy conservative schemes for the evolution step for shock computations. We observe that the resulting adaptive schemes yield approximations free of oscillations in contrast to known fully discrete entropy conservative schemes on uniform meshes. We conclude that entropy conservative schemes are ...
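The abstract does not show the scheme itself; as a minimal sketch of the entropy-conservative building block, the code below applies Tadmor's entropy-conservative flux for the inviscid Burgers equation on a uniform periodic mesh with a forward-Euler step. The uniform mesh and the time integrator are simplifying assumptions made here, not the authors' adaptive, fully discrete scheme.

```python
# A minimal sketch, not the authors' adaptive scheme: Tadmor's entropy-conservative
# flux for the inviscid Burgers equation u_t + (u^2/2)_x = 0, on a uniform periodic
# mesh with a simple forward-Euler step (both choices are assumptions made here).
import numpy as np

def entropy_conservative_flux(u_left, u_right):
    # Tadmor's flux for Burgers: f*(uL, uR) = (uL^2 + uL*uR + uR^2) / 6
    return (u_left**2 + u_left * u_right + u_right**2) / 6.0

def step(u, dx, dt):
    # Interface flux between cell i and its right neighbour (periodic wrap-around).
    f_half = entropy_conservative_flux(u, np.roll(u, -1))
    # Conservative update: u_i <- u_i - dt/dx * (F_{i+1/2} - F_{i-1/2})
    return u - dt / dx * (f_half - np.roll(f_half, 1))

if __name__ == "__main__":
    n = 200
    x = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
    dx = x[1] - x[0]
    u = np.sin(x)
    for _ in range(100):                  # stop well before shock formation (t = 1)
        u = step(u, dx, dt=0.2 * dx)
    # The semi-discrete scheme conserves the total entropy sum(u^2/2)*dx exactly;
    # the forward-Euler step adds only a small O(dt) drift.
    print("total entropy:", np.sum(0.5 * u**2) * dx)
```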

2014
Jordi Solé-Casals Pere Martí-Puig Ramon Reig-Bolaño

In this paper we explore the use of non-linear transformations in order to improve the performance of an entropy-based voice activity detector (VAD). The idea of using a non-linear transformation comes from previous work done in the speech linear prediction (LPC) field, based on source separation techniques, where the score function was added into the classical equations in order to take into a...
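A minimal sketch of the baseline such work starts from: an entropy-based VAD that labels a frame as speech when its spectral entropy is low (speech spectra are peakier than broadband noise). The non-linear transformation and the LPC/source-separation score function mentioned in the abstract are not reproduced; the frame length, hop and threshold below are illustrative assumptions.

```python
# Baseline spectral-entropy VAD sketch; frame length, hop and threshold are
# illustrative choices, not taken from the paper.
import numpy as np

def spectral_entropy(frame, eps=1e-12):
    spectrum = np.abs(np.fft.rfft(frame)) ** 2
    p = spectrum / (spectrum.sum() + eps)        # normalise to a probability mass
    return -np.sum(p * np.log(p + eps))          # Shannon entropy of the spectrum

def entropy_vad(signal, frame_len=400, hop=160, threshold=3.0):
    """Flag a frame as speech (True) when its spectral entropy is below threshold."""
    n_frames = 1 + max(0, (len(signal) - frame_len) // hop)
    flags = []
    for i in range(n_frames):
        frame = signal[i * hop:i * hop + frame_len]
        flags.append(spectral_entropy(frame) < threshold)
    return np.array(flags)

if __name__ == "__main__":
    fs = 16000
    t = np.arange(fs) / fs
    noise = 0.05 * np.random.randn(fs)
    tone = np.where(t < 0.5, np.sin(2 * np.pi * 200 * t), 0.0)  # "voiced" first half
    print(entropy_vad(tone + noise).astype(int))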

Journal: Computers & Mathematics with Applications, 2010
J. Mohanalin Beenamol Prem Kumar Kalra Nirmal Kumar

This article investigates a novel automatic microcalcification detection method using a type II fuzzy index. The thresholding is performed using the Tsallis entropy characterized by another parameter ‘q’, which depends on the non-extensiveness of a mammogram. In previous studies, ‘q’ was calculated using the histogram distribution, which can lead to erroneous results when pectoral muscles are i...
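For context, a sketch of plain Tsallis-entropy thresholding of a grey-level histogram with a fixed 'q', using the usual pseudo-additivity criterion. The paper's type II fuzzy index and its data-driven choice of 'q' are not reproduced here; the fixed q value and the synthetic histogram are assumptions.

```python
# Tsallis-entropy thresholding sketch with a fixed 'q' (the paper's type II
# fuzzy index and automatic 'q' selection are not reproduced).
import numpy as np

def tsallis_entropy(p, q):
    p = p[p > 0]
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def tsallis_threshold(image, q=0.8, levels=256):
    hist, _ = np.histogram(image, bins=levels, range=(0, levels))
    prob = hist / hist.sum()
    best_t, best_score = 0, -np.inf
    for t in range(1, levels - 1):
        w_a, w_b = prob[:t].sum(), prob[t:].sum()
        if w_a == 0 or w_b == 0:
            continue
        s_a = tsallis_entropy(prob[:t] / w_a, q)   # background class entropy
        s_b = tsallis_entropy(prob[t:] / w_b, q)   # foreground class entropy
        score = s_a + s_b + (1.0 - q) * s_a * s_b  # pseudo-additivity criterion
        if score > best_score:
            best_t, best_score = t, score
    return best_t

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = np.concatenate([rng.normal(60, 10, 5000), rng.normal(180, 10, 500)])
    img = np.clip(img, 0, 255).astype(np.uint8)
    print("selected threshold:", tsallis_threshold(img))
```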

1993
François Blanchard

A cover of a compact set X by two non-dense open sets is called a standard cover. In the Cartesian square of a flow (X, T), pairs (x, x′) outside the diagonal are defined as entropy pairs whenever any standard cover (U, V) such that (x, x′) ∈ Int(U^c) × Int(V^c) has positive entropy. The set of such pairs is nonempty provided h(X, T) > 0; it is T × T-invariant, and all pairs in its clo...

Journal: Signal Processing, 2005
Sarit Shwartz Michael Zibulevsky Yoav Y. Schechner

Differential entropy is a quantity used in many signal processing problems. Often we need to calculate not only the entropy itself, but also its gradient with respect to various variables, for efficient optimization, sensitivity analysis, etc. Entropy estimation can be based on an estimate of the probability density function, which is computationally costly if done naively. Some prior algorithm...
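As a point of reference for the "computationally costly if done naively" remark, the sketch below computes a naive plug-in differential entropy estimate from a Gaussian kernel density estimate, which costs O(N^2). The bandwidth rule and the leave-in estimator are illustrative choices; the paper's efficient gradient computation is not reproduced.

```python
# Naive plug-in (resubstitution) differential entropy estimate:
#   H_hat = -(1/N) * sum_i log p_hat(x_i),  with p_hat a Gaussian KDE.
# O(N^2) pairwise kernel evaluations; bandwidth rule is an assumption.
import numpy as np

def kde_differential_entropy(x, bandwidth=None):
    x = np.asarray(x, dtype=float)
    n = x.size
    if bandwidth is None:
        bandwidth = 1.06 * x.std() * n ** (-1 / 5)   # Silverman's rule of thumb
    diff = (x[:, None] - x[None, :]) / bandwidth
    kernel = np.exp(-0.5 * diff ** 2) / (bandwidth * np.sqrt(2 * np.pi))
    p_hat = kernel.mean(axis=1)                      # leave-in density estimate
    return -np.mean(np.log(p_hat))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    sample = rng.normal(0.0, 1.0, 2000)
    # True differential entropy of N(0, 1) is 0.5*log(2*pi*e) ~= 1.4189
    print(kde_differential_entropy(sample))
```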

2012
Daniel J. Thompson

Bowen showed that a continuous expansive map with specification has a unique measure of maximal entropy. We show that the conclusion remains true under weaker non-uniform versions of these hypotheses. To this end, we introduce the notions of obstructions to expansivity and specification, and show that if the entropy of such obstructions is smaller than the topological entropy of the map, then t...

Journal: Entropy, 2010
Amir Hossein Darooneh Ghassem Naeimi Ali Mehri Parvin Sadeghi

Non-extensive statistical mechanics appears as a powerful way to describe complex systems. Tsallis entropy, the main core of this theory, has remained an unproven assumption. Many people have tried to derive the Tsallis entropy axiomatically. Here we follow the work of Wang (EPJB, 2002) and use incomplete information theory to retrieve the Tsallis entropy. We change the incomplete in...
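For reference, the standard discrete form of the Tsallis entropy (Boltzmann constant set to 1), its Shannon limit as q → 1, and the pseudo-additivity rule that makes it non-extensive; this is the textbook definition, not the incomplete-information derivation the paper follows.

```latex
% Textbook Tsallis entropy, its Shannon limit, and pseudo-additivity.
S_q(p) = \frac{1 - \sum_i p_i^{\,q}}{q - 1},
\qquad
\lim_{q \to 1} S_q(p) = -\sum_i p_i \ln p_i ,
\qquad
S_q(A{+}B) = S_q(A) + S_q(B) + (1 - q)\, S_q(A)\, S_q(B)
\quad \text{(for independent systems } A, B\text{).}
```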

1993
J. Pérez-Mercader

We define an entropy for a quantum field theory by combining quantum fluctuations, scaling and the maximum entropy concept. This entropy has different behavior in asymptotically free and non-asymptotically free theories. We find that the transition between the two regimes (from the asymptotically free to the non-asymptotically free) takes place via a continuous phase transition. For asymptotica...

2009
Xiao-yu Chen

In quantum information theory, the von Neumann entropy plays an important role. The entropies can be obtained analytically only for a few states. In a continuous-variable system, even evaluating the entropy numerically is not an easy task, since the dimension is infinite. We develop perturbation theory systematically for calculating the von Neumann entropy of non-degenerate systems as well as degenerate s...
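As a concrete baseline for what such a perturbation theory approximates, the sketch below evaluates S(ρ) = -Tr(ρ ln ρ) directly from the eigenvalues of a density matrix, for a single-mode thermal state truncated to a finite Fock basis. The truncation dimension and mean photon number are illustrative assumptions; the paper's perturbative expansion is not reproduced.

```python
# Direct (non-perturbative) von Neumann entropy from eigenvalues,
# for a truncated single-mode thermal state (illustrative parameters).
import numpy as np

def von_neumann_entropy(rho):
    eigenvalues = np.linalg.eigvalsh(rho)
    eigenvalues = eigenvalues[eigenvalues > 1e-12]   # drop numerical zeros
    return float(-np.sum(eigenvalues * np.log(eigenvalues)))

def truncated_thermal_state(mean_photons, dim):
    n = np.arange(dim)
    p = (mean_photons ** n) / (1.0 + mean_photons) ** (n + 1)
    return np.diag(p / p.sum())                      # renormalise after truncation

if __name__ == "__main__":
    nbar, dim = 0.5, 60
    rho = truncated_thermal_state(nbar, dim)
    # Exact entropy of a thermal state: (nbar+1) ln(nbar+1) - nbar ln(nbar)
    exact = (nbar + 1) * np.log(nbar + 1) - nbar * np.log(nbar)
    print("numerical:", von_neumann_entropy(rho), " exact:", exact)
```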

2006
Ambedkar Dukkipati Shalabh Bhatnagar

The measure-theoretic definition of Kullback-Leibler relative-entropy (KL-entropy) plays a basic role in the definitions of classical information measures. Entropy, mutual information and conditional forms of entropy can be expressed in terms of KL-entropy and hence properties of their measure-theoretic analogs will follow from those of measure-theoretic KL-entropy. These measure-theoretic defi...
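A small numerical illustration (discrete case, so sums rather than measure-theoretic integrals) of the identities alluded to: mutual information equals the KL divergence between the joint distribution and the product of its marginals, and also H(X) + H(Y) - H(X, Y). The joint distribution below is an arbitrary example, not from the paper.

```python
# Discrete check: I(X;Y) = D_KL(p(x,y) || p(x)p(y)) = H(X) + H(Y) - H(X,Y).
import numpy as np

def shannon(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def kl_divergence(p, q):
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

if __name__ == "__main__":
    joint = np.array([[0.30, 0.10],
                      [0.05, 0.55]])                 # p(x, y), sums to 1
    px = joint.sum(axis=1)
    py = joint.sum(axis=0)
    product = np.outer(px, py)                       # p(x) p(y)

    mi_kl = kl_divergence(joint.ravel(), product.ravel())
    mi_h = shannon(px) + shannon(py) - shannon(joint.ravel())
    print("I(X;Y) via KL :", mi_kl)
    print("I(X;Y) via H  :", mi_h)
```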
