Search results for: entropy estimate

Number of results: 291306

Hossein Bevrani, Masoud Ganji, Nasrin Hami Golzar

Abstract: The exponential distribution is a popular model in applications to real data. We propose a new extension of this distribution, called the Lomax-exponential distribution, which gives the model greater flexibility. There is also a simple relation between the Lomax-exponential distribution and the Lomax distribution. Results for the moments, limit behavior, hazard function, Shannon entr...
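The truncated abstract mentions Shannon entropy results. As a reference point only (the entropy of the baseline exponential distribution, not of the proposed Lomax-exponential extension, which the abstract does not specify), the standard closed form is:

```latex
% Differential (Shannon) entropy of the baseline Exp(lambda) distribution,
% f(x) = \lambda e^{-\lambda x}, x > 0 -- a reference point only, not the
% entropy of the proposed Lomax-exponential extension.
h(X) = -\int_0^\infty \lambda e^{-\lambda x}\,
        \ln\!\left(\lambda e^{-\lambda x}\right)\mathrm{d}x
     = 1 - \ln\lambda .
```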

Journal: :Neural computation 2009
Vincent Q. Vu, Bin Yu, Robert E. Kass

Information estimates such as the direct method of Strong, Koberle, de Ruyter van Steveninck, and Bialek (1998) sidestep the difficult problem of estimating the joint distribution of response and stimulus by instead estimating the difference between the marginal and conditional entropies of the response. While this is an effective estimation strategy, it tempts the practitioner to ignore the ro...
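As a rough illustration of the entropy-difference strategy described above (not the authors' exact "direct method" pipeline, which also involves binning spike trains into words and extrapolating to infinite data), a plug-in estimate of I(R;S) = H(R) − H(R|S) for already-discretized responses might look like the sketch below; the function and array names are assumptions.

```python
import numpy as np

def plugin_entropy(counts):
    """Plug-in (maximum-likelihood) entropy in bits from a vector of counts."""
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def entropy_difference_mi(responses, stimuli):
    """Estimate I(R;S) = H(R) - H(R|S) from discrete response/stimulus labels.

    `responses` and `stimuli` are 1-D integer arrays of equal length
    (hypothetical names; a real "direct method" analysis adds word binning
    and bias extrapolation on top of this).
    """
    responses = np.asarray(responses)
    stimuli = np.asarray(stimuli)
    # Marginal response entropy H(R)
    _, counts = np.unique(responses, return_counts=True)
    h_marginal = plugin_entropy(counts)
    # Conditional entropy H(R|S): average of H(R | S = s), weighted by P(S = s)
    h_conditional = 0.0
    for s in np.unique(stimuli):
        mask = stimuli == s
        _, c = np.unique(responses[mask], return_counts=True)
        h_conditional += mask.mean() * plugin_entropy(c)
    return h_marginal - h_conditional
```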

Journal: :Journal of Mathematical Analysis and Applications 2022

In this paper, we establish the entropy-entropy production estimate for the ES-BGK model, a generalized version of the BGK model of the Boltzmann equation introduced for better approximation in the Navier-Stokes limit. Our result improves the previous entropy estimate [38] in that (1) the full range of Prandtl parameters −1/2≤ν<1, including the critical case ν=−1/2, is covered, and (2) a sharper bound is obtained. An explicit characterization of the coeffic...
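The abstract is cut off before the estimate itself is stated. Schematically, an entropy-entropy production estimate of this kind bounds the entropy production of the relaxation operator from below by the relative entropy with respect to the local equilibrium; the generic form below is a sketch only, and the constant and the precise equilibrium used in the paper may differ.

```latex
% Schematic entropy-entropy production estimate (generic form):
%   f      : velocity distribution,   M(f)   : local Maxwellian,
%   D(f)   : entropy production of the ES-BGK relaxation operator,
%   H(f|M) : relative entropy of f with respect to M(f).
D(f) \;\ge\; C_\nu \, H\!\left(f \,\middle|\, M(f)\right)
     \;=\; C_\nu \int_{\mathbb{R}^3} f \,\ln\frac{f}{M(f)}\,\mathrm{d}v,
\qquad -\tfrac{1}{2} \le \nu < 1 .
```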

Journal: :Nephron. Clinical practice 2011
Vianda S. Stel, Friedo W. Dekker, Giovanni Tripepi, Carmine Zoccali, Kitty J. Jager

The Kaplan-Meier (KM) method is used to analyze 'time-to-event' data. The outcome in KM analysis often includes all-cause mortality, but could also include other outcomes such as the occurrence of a cardiovascular event. The purpose of this article is to explain the basic concepts of the KM method, to provide some guidance regarding the presentation of the KM results and to discuss some importa...
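To make the construction concrete, here is a minimal sketch of the KM product-limit estimator for right-censored data, S(t) = ∏_{t_i ≤ t} (1 − d_i/n_i); the function and variable names are assumptions, not taken from the article.

```python
import numpy as np

def kaplan_meier(durations, event_observed):
    """Product-limit (Kaplan-Meier) survival estimate.

    durations      : array of follow-up times
    event_observed : 1 if the event (e.g. death) occurred, 0 if censored
    Returns the distinct event times and the estimated survival S(t) at each.
    """
    durations = np.asarray(durations, dtype=float)
    event_observed = np.asarray(event_observed, dtype=int)
    event_times = np.unique(durations[event_observed == 1])
    survival, s = [], 1.0
    for t in event_times:
        n_at_risk = np.sum(durations >= t)                       # still under observation at t
        n_events = np.sum((durations == t) & (event_observed == 1))
        s *= 1.0 - n_events / n_at_risk                          # product-limit update
        survival.append(s)
    return event_times, np.array(survival)

# Example: 5 patients, two of them censored (event_observed = 0)
times, surv = kaplan_meier([2, 3, 3, 5, 8], [1, 1, 0, 1, 0])
```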

2008
Noriaki Ogawa, Seiji Terashima

In the LLM bubbling geometries, we compute the entropies of black holes and estimate their “horizon” sizes from the fuzzball conjecture, based on coarse-graining on the gravity side. The differences of black hole microstates cannot be seen by classical observations. Conversely, by counting the possible deformations of the geometry which are not classically detectable, we can calculate the entro...

Journal: :Entropy 2015
Deniz Gençaga, Kevin H. Knuth, William B. Rossow

Information-theoretic quantities, such as entropy and mutual information (MI), can be used to quantify the amount of information needed to describe a dataset or the information shared between two datasets. In the case of a dynamical system, the behavior of the relevant variables can be tightly coupled, such that information about one variable at a given instance in time may provide information ...
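As a small, generic illustration of the kind of quantity discussed (not the estimators proposed in the paper), the mutual information between one variable and a lagged copy of another can be computed from a joint histogram; the function name, bin count, and lag below are arbitrary choices.

```python
import numpy as np

def lagged_mutual_information(x, y, lag=1, bins=16):
    """Histogram (plug-in) estimate of I(X_t ; Y_{t-lag}) in bits.

    A crude sketch: real analyses must worry about bin choice and
    sampling bias, which is exactly what the cited work addresses.
    """
    x = np.asarray(x)[lag:]
    y = np.asarray(y)[:-lag] if lag > 0 else np.asarray(y)
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)      # marginal of X
    py = pxy.sum(axis=0, keepdims=True)      # marginal of Y
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))
```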

Journal: :Frontiers in Neuroinformatics 2021

Calculations of the entropy of a signal or the mutual information between two variables are valuable analytical tools in the field of neuroscience. They can be applied to all types of data, capture non-linear interactions, and are model independent. Yet the limited size and number of recordings one can collect in a series of experiments makes their calculation highly prone to sampling bias. Mathematical methods to overcome this so-called “sampl...
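One classical example of the kind of bias correction the abstract alludes to is the Miller-Madow adjustment, which adds (K̂ − 1)/(2N) nats to the plug-in entropy, where K̂ is the number of occupied bins and N the sample size. The sketch below is a generic illustration, not the specific method of the paper.

```python
import numpy as np

def miller_madow_entropy(samples):
    """Plug-in entropy (nats) with the Miller-Madow bias correction.

    The naive plug-in estimator is biased downward for small samples;
    Miller-Madow adds (K_hat - 1) / (2 N), with K_hat the number of
    observed (non-empty) categories and N the number of samples.
    """
    samples = np.asarray(samples)
    _, counts = np.unique(samples, return_counts=True)
    n = counts.sum()
    p = counts / n
    h_plugin = -np.sum(p * np.log(p))
    k_hat = np.count_nonzero(counts)
    return h_plugin + (k_hat - 1) / (2.0 * n)
```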

M. Moazamnia, S. Sadeghfam, Y. Hassanzadeh

The practical concept of the velocity distribution of pressure flow in bends is of interest, and hence it has been investigated for professional engineering design in the current study. This paper shows that the velocity distribution in bends can be analyzed in terms of probability distributions. The concept of entropy, based on probability, is a new and applied approach to achieve the velocity pro...
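The classical entropy-based velocity profile in this line of work is Chiu's, obtained by maximizing Shannon entropy subject to mass conservation; whether the paper uses exactly this form is not stated in the truncated abstract, so the formula below is given only as background.

```latex
% Chiu-type entropy-based velocity profile (maximum-entropy derivation);
% xi in [0,1] is a normalized spatial coordinate, M the entropy parameter,
% u_max the maximum velocity in the cross-section.
u(\xi) \;=\; \frac{u_{\max}}{M}\,
        \ln\!\Bigl[\,1 + \bigl(e^{M} - 1\bigr)\,\xi\,\Bigr],
\qquad 0 \le \xi \le 1 .
```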

Chart of the number of search results per year
