Search results for: g entropy

Number of results: 504631

2011
James Alexander

The enumeration of independent sets in graphs with various restrictions has been a topic of much interest of late. Let i(G) be the number of independent sets in a graph G and let it(G) be the number of independent sets in G of size t. Kahn used entropy to show that if G is an r-regular bipartite graph with n vertices, then i(G) ≤ i(Kr,r)^(n/2r). Zhao used bipartite double covers to extend this bound t...
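As a quick sanity check on the quantities in this abstract, here is a minimal brute-force sketch that counts independent sets in a small 2-regular bipartite graph (the 6-cycle) and compares i(G) against the bound i(Kr,r)^(n/2r); the helper names and the toy instance are illustrative assumptions, not the entropy argument itself.

```python
from itertools import combinations

def independent_set_count(n, edges):
    """Count independent sets (including the empty set) in a graph on
    vertices 0..n-1 by brute force."""
    edge_set = {frozenset(e) for e in edges}
    count = 0
    for size in range(n + 1):
        for subset in combinations(range(n), size):
            if all(frozenset(pair) not in edge_set
                   for pair in combinations(subset, 2)):
                count += 1
    return count

# C_6: a 2-regular bipartite graph on n = 6 vertices (so r = 2).
n, r = 6, 2
c6_edges = [(i, (i + 1) % n) for i in range(n)]

# K_{r,r}: complete bipartite graph on parts {0,...,r-1} and {r,...,2r-1}.
krr_edges = [(a, b) for a in range(r) for b in range(r, 2 * r)]

i_g = independent_set_count(n, c6_edges)          # i(C_6)   = 18
i_krr = independent_set_count(2 * r, krr_edges)   # i(K_2,2) = 7

# Kahn's bound for r-regular bipartite graphs: i(G) <= i(K_{r,r})^(n/(2r)).
bound = i_krr ** (n / (2 * r))
print(i_g, round(bound, 2), i_g <= bound)   # 18 18.52 True
```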

Journal: Combinatorics, Probability & Computing 2012
Hiu-Fai Law Colin McDiarmid

The enumeration of independent sets in graphs with various restrictions has been a topic of much interest of late. Let i(G) be the number of independent sets in a graph G and let it(G) be the number of independent sets in G of size t. Kahn used entropy to show that if G is an r-regular bipartite graph with n vertices, then i(G) ≤ i(Kr,r)^(n/2r). Zhao used bipartite double covers to extend this bound t...

Journal: مجله علوم آماری (Journal of Statistical Sciences)
Arezoo Habibi Rad (Department of Statistics, Ferdowsi University of Mashhad, Mashhad, Iran), Naser Reza Arghami (Department of Statistics, Ferdowsi University of Mashhad, Mashhad, Iran)

The sample entropy estimate was first introduced by Vasicek (1976). In this paper, we provide an estimate of the entropy of order statistics, which is an extension of that entropy estimate. We then present an application of the entropy estimate of order statistics as a test statistic for symmetry of a distribution versus skewness. The proposed test has been compared wi...
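For context, here is a minimal sketch of the Vasicek (1976) spacing estimator that the abstract builds on; the function name, the window parameter m, and the clamping convention at the sample boundaries are illustrative choices, not the authors' code.

```python
import math
import random

def vasicek_entropy(sample, m):
    """Vasicek (1976) spacing-based estimate of differential entropy.

    sample: iterable of real-valued observations
    m: window (spacing) parameter, 1 <= m < len(sample) / 2
    """
    x = sorted(sample)
    n = len(x)
    total = 0.0
    for i in range(1, n + 1):
        upper = x[min(i + m, n) - 1]   # X_(i+m), clamped to X_(n)
        lower = x[max(i - m, 1) - 1]   # X_(i-m), clamped to X_(1)
        total += math.log(n / (2 * m) * (upper - lower))
    return total / n

# Example: standard normal data; the true differential entropy is
# 0.5 * log(2 * pi * e) ~= 1.419 nats.
random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(1000)]
print(vasicek_entropy(data, m=10))
```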

2017
Maciej Obremski Maciej Skorski

We revisit the problem of estimating entropy of discrete distributions from independent samples, studied recently by Acharya, Orlitsky, Suresh and Tyagi (SODA 2015), improving their upper and lower bounds on the necessary sample size n. For estimating Rényi entropy of order α, up to constant accuracy and error probability, we show the following upper bounds: n = O(1) · 2^((1−1/α)Hα) for integer α...
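For reference, the quantity Hα in the sample-size bound is the Rényi entropy of order α of the underlying discrete distribution; this is the standard definition, not text taken from the paper itself.

```latex
H_\alpha(p) = \frac{1}{1-\alpha} \log \sum_{x} p(x)^{\alpha},
\qquad \alpha > 0,\ \alpha \neq 1,
\qquad \lim_{\alpha \to 1} H_\alpha(p) = -\sum_{x} p(x) \log p(x).
```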

Journal: Physical Review Letters 2004
G C Levine

Boundary impurities are known to dramatically alter certain bulk properties of (1+1)-dimensional strongly correlated systems. The entanglement entropy of a zero-temperature Luttinger liquid bisected by a single impurity is computed using a novel finite-size scaling or bosonization scheme. For a Luttinger liquid of length 2L and UV cutoff ε, the boundary impurity correction (δS_imp) to ...
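For orientation, the clean (impurity-free) baseline against which such a correction is measured is the familiar logarithmic scaling of entanglement entropy in a c = 1 conformal field theory; the precise prefactor depends on the geometry and boundary conditions, so the form below is only the standard single-interval result, not the paper's own formula.

```latex
S_0 \sim \frac{c}{3}\,\log\!\left(\frac{\ell}{\varepsilon}\right) + \text{const},
\qquad c = 1 \ \text{for a Luttinger liquid},
```

where ℓ is the subsystem length and ε the UV cutoff.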

Journal: Entropy 2018
Jacques Demongeot Mariem Jelassi Hana Hazgui Slimane Ben Miled Narjès Bellamine Ben Saoud Carla Taramasco

Networks used in biological applications at different scales (molecule, cell and population) are of different types: neuronal, genetic, and social, but they share the same dynamical concepts, in their continuous differential versions (e.g., non-linear Wilson-Cowan system) as well as in their discrete Boolean versions (e.g., non-linear Hopfield system); in both cases, the notion of interaction g...
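To make the discrete Boolean picture concrete, here is a minimal threshold Boolean network sketch in the Hopfield spirit; the weight matrix, thresholds, and synchronous update rule are illustrative assumptions, not the specific networks studied in the paper.

```python
import itertools

def boolean_step(state, weights, thresholds):
    """One synchronous update of a threshold Boolean (Hopfield-style) network.

    state: tuple of 0/1 node states
    weights: weights[i][j] is the influence of node j on node i
    thresholds: activation threshold of each node
    """
    return tuple(
        1 if sum(w * s for w, s in zip(row, state)) > theta else 0
        for row, theta in zip(weights, thresholds)
    )

def attractors(weights, thresholds, n):
    """Return one representative state from each attractor reached
    from every possible initial condition."""
    reached = set()
    for init in itertools.product((0, 1), repeat=n):
        state, seen = init, set()
        while state not in seen:          # iterate until a cycle is entered
            seen.add(state)
            state = boolean_step(state, weights, thresholds)
        reached.add(state)
    return reached

# Toy 3-node network: nodes 0 and 1 reinforce each other, node 2 inhibits node 0.
W = [[0, 2, -1],
     [2, 0, 0],
     [0, 0, 0]]
theta = [0.5, 0.5, 0.5]
print(attractors(W, theta, n=3))
```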

M. Manajjem R. Zhisni

Polycyclic aromatic hydrocarbons (PAHs) play an important role in the formation of combustion-generated particles such as soot, and their presence in atmospheric aerosols has been widely shown. The formation of five-membered rings, detected in combustion effluent, is of great interest due to their genotoxic activity. The present study reports an investigation of the electronic structure of Acenapht...

Journal: CoRR 2012
Varun Jog Venkat Anantharam

Shannon’s Entropy Power Inequality can be viewed as characterizing the minimum differential entropy achievable by the sum of two independent random variables with fixed differential entropies. The entropy power inequality has played a key role in resolving a number of problems in information theory. It is therefore interesting to examine the existence of a similar inequality for discrete random...
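For reference, the scalar Entropy Power Inequality referred to here can be stated in its standard form, with h(·) the differential entropy in nats and X, Y independent real-valued random variables.

```latex
N(X+Y) \;\ge\; N(X) + N(Y),
\qquad N(X) = \frac{1}{2\pi e}\, e^{2h(X)},
```

equivalently, e^{2h(X+Y)} ≥ e^{2h(X)} + e^{2h(Y)}.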

Journal: IEEE Trans. Information Theory 2013
Varun Jog Venkat Anantharam

Shannon’s Entropy Power Inequality (EPI) can be viewed as characterizing the minimum differential entropy achievable by the sum of two independent random variables with fixed differential entropies. The EPI is a powerful tool and has been used to resolve a number of problems in information theory. In this paper we examine the existence of a similar entropy inequality for discrete random variabl...

2017
Marianne Akian Stéphane Gaubert Julien Grand-Clément Jérémie Guillaud

Entropy games and matrix multiplication games have been recently introduced by Asarin et al. They model the situation in which one player (Despot) wishes to minimize the growth rate of a matrix product, whereas the other player (Tribune) wishes to maximize it. We develop an operator approach to entropy games. This allows us to show that entropy games can be cast as stochastic mean payoff games ...
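As a rough illustration of the min-max growth-rate objective (not the operator approach developed in the paper), here is a naive finite-horizon sketch in which Despot and Tribune alternately choose matrices from small hand-picked sets; the matrices, the horizon, and the use of the entry sum as a size proxy are all assumptions made for the toy example.

```python
import numpy as np

def minmax_growth(despot_mats, tribune_mats, horizon):
    """Crude finite-horizon proxy for the value of a matrix multiplication /
    entropy game: at each round Despot picks a matrix (minimizing) and then
    Tribune picks one (maximizing), and the payoff is the growth rate of the
    resulting product, measured here by the log of the sum of its entries.
    The actual games are defined via limits and richer strategy spaces."""
    dim = despot_mats[0].shape[0]

    def recurse(prod, depth):
        if depth == horizon:
            return np.log(prod.sum())
        return min(
            max(recurse(prod @ d @ t, depth + 1) for t in tribune_mats)
            for d in despot_mats
        )

    return recurse(np.eye(dim), 0) / horizon

# Toy 2x2 nonnegative matrices for each player (purely illustrative).
D = [np.array([[1.0, 1.0], [0.0, 1.0]]),
     np.array([[1.0, 0.0], [1.0, 1.0]])]
T = [np.array([[2.0, 0.0], [0.0, 1.0]]),
     np.array([[1.0, 1.0], [1.0, 1.0]])]
print(minmax_growth(D, T, horizon=5))
```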
