Search results for: rényi

Number of results: 6772

2015
Amos J. Storkey, Zhanxing Zhu, Jinli Hu

Trading in information markets, such as machine learning markets, has been shown to be an effective approach for aggregating the beliefs of different agents. In a machine learning context, aggregation commonly uses forms of linear opinion pools, or logarithmic (log) opinion pools. It is interesting to relate information market aggregation to the machine learning setting. In this paper we introd...
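The two aggregation rules this abstract names have simple closed forms. As a minimal illustration (the function names and the toy beliefs below are my own, not taken from the paper), a linear opinion pool is a weighted arithmetic mean of the agents' probability vectors, while a logarithmic opinion pool is a renormalized weighted geometric mean:

```python
import numpy as np

def linear_opinion_pool(beliefs, weights):
    """Weighted arithmetic mean of the agents' probability vectors (a mixture)."""
    beliefs = np.asarray(beliefs, dtype=float)
    weights = np.asarray(weights, dtype=float)
    return weights @ beliefs  # stays normalized if beliefs and weights each sum to 1

def log_opinion_pool(beliefs, weights):
    """Renormalized weighted geometric mean of the agents' probability vectors (a product)."""
    beliefs = np.asarray(beliefs, dtype=float)
    weights = np.asarray(weights, dtype=float)
    pool = np.exp(weights @ np.log(beliefs))
    return pool / pool.sum()

# Two agents' beliefs over three outcomes, equal weights.
beliefs = [[0.7, 0.2, 0.1],
           [0.2, 0.5, 0.3]]
weights = [0.5, 0.5]
print(linear_opinion_pool(beliefs, weights))
print(log_opinion_pool(beliefs, weights))
```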

Journal: Advances in Complex Systems, 2005
Matt Davison, J. S. Shiner

Landsberg’s notion of disorder, entropy normalized to maximum entropy, was originally proposed for the Shannon information-theoretic entropy to overcome deficiencies of entropy as a measure of disorder due to extensivity. We generalize Landsberg’s concept to three classes of extended entropies: Rényi, Tsallis and Landsberg-Vedral. Three examples are treated, including one based on the logistic ...
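As a rough illustration of the normalization being generalized (a sketch under my own conventions, not the paper's notation), Landsberg-style disorder divides an entropy by its maximum value, log K for K outcomes, and the same recipe can be applied with the Shannon entropy replaced by a Rényi entropy:

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy H_a(p) = log(sum_i p_i^a) / (1 - a); reduces to Shannon as a -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return -np.sum(p * np.log(p))      # Shannon limit
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def disorder(p, alpha=1.0):
    """Entropy normalized to its maximum (log K for K outcomes), in Landsberg's spirit."""
    return renyi_entropy(p, alpha) / np.log(len(p))

p = [0.5, 0.25, 0.125, 0.125]
for a in (0.5, 1.0, 2.0):
    print(a, disorder(p, a))
```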

Journal: Journal of Graph Theory, 2007
Dhruv Mubayi, Jason Williford

The Erdős-Rényi and Projective Norm graphs are algebraically defined graphs that have proved useful in supplying constructions in extremal graph theory and Ramsey theory. Their eigenvalues have been computed and this yields an upper bound on their independence number. Here we show that in many cases, this upper bound is sharp in order of magnitude. Our result for the Erdős-Rényi graph has the f...

Journal: IEEE Trans. Information Theory, 2015
Milán Mosonyi

We show two-sided bounds between the traditional quantum Rényi divergences and the new notion of Rényi divergences introduced recently in Müller-Lennert, Dupuis, Szehr, Fehr and Tomamichel, J. Math. Phys. 54, 122203 (2013), and Wilde, Winter, Yang, arXiv:1306.1586. The bounds imply that the two versions can be used interchangeably near α = 1, and hence one can benefit from the best properties ...
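For orientation, the two families being compared have the following standard forms (textbook definitions, not quoted from this paper): the traditional (Petz-type) quantum Rényi divergence and the sandwiched divergence of Müller-Lennert et al. and Wilde-Winter-Yang, both of which reduce to the Umegaki relative entropy as α → 1:

```latex
D_\alpha(\rho\|\sigma) = \frac{1}{\alpha-1}\,\log \operatorname{Tr}\!\left(\rho^{\alpha}\sigma^{1-\alpha}\right),
\qquad
\widetilde{D}_\alpha(\rho\|\sigma) = \frac{1}{\alpha-1}\,\log \operatorname{Tr}\!\left[\left(\sigma^{\frac{1-\alpha}{2\alpha}}\,\rho\,\sigma^{\frac{1-\alpha}{2\alpha}}\right)^{\!\alpha}\right],
\qquad
\lim_{\alpha\to 1} D_\alpha = \lim_{\alpha\to 1}\widetilde{D}_\alpha = \operatorname{Tr}\,\rho\left(\log\rho-\log\sigma\right).
```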

2012
László Erdős, Antti Knowles, Horng-Tzer Yau, Jun Yin

We consider the ensemble of adjacency matrices of Erdős-Rényi random graphs, i.e. graphs on N vertices where every edge is chosen independently and with probability p ≡ p(N). We rescale the matrix so that its bulk eigenvalues are of order one. Under the assumption pN ≫ N^{2/3}, we prove the universality of eigenvalue distributions both in the bulk and at the edge of the spectrum. More precisely, we pro...
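A small numerical sketch of this setup (the normalization by 1/sqrt(Np(1-p)) is my own reading of "bulk eigenvalues of order one"; the paper's exact conventions may differ):

```python
import numpy as np

def erdos_renyi_spectrum(N, p, seed=0):
    """Sample the adjacency matrix of G(N, p) and rescale it so the bulk
    eigenvalues are of order one, then return the full spectrum."""
    rng = np.random.default_rng(seed)
    upper = rng.random((N, N)) < p
    A = np.triu(upper, k=1).astype(float)
    A = A + A.T                              # symmetric 0/1 adjacency matrix, no self-loops
    H = A / np.sqrt(N * p * (1.0 - p))       # assumed rescaling
    return np.linalg.eigvalsh(H)             # ascending order

eigs = erdos_renyi_spectrum(N=2000, p=0.05)
# The bulk fills roughly [-2, 2] (semicircle law); the single largest eigenvalue
# is a Perron-Frobenius outlier of order sqrt(Np).
print(eigs[0], eigs[-2], eigs[-1])
```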

Journal: Open Syst. Inform. Dynam., 2004
B. H. Lavenda

PAE cannot be made a basis for either a generalized statistical mechanics or a generalized information theory. Either statistical independence must be waived, or the expression of the averaged conditional probability as the difference between the marginal and joint entropies must be relinquished. The same inequality, relating the PAE to the Rényi entropy, when applied to the mean code length pr...

2014
Amos J. Storkey, Zhanxing Zhu, Jinli Hu

This is a preprint, and does not constitute publication, but is provided for the benefit of attendees to accompany a talk at the ICML Workshop on Divergence Methods for Probabilistic Inference. If you wish to reference this paper, please reference the final published version. Machine learning models rely heavily on two compositional methods: mixtures and products. Probabilistic aggregation al...

Journal: Physical Review E, 2017
Vincenzo Alba

In recent years entanglement measures, such as the von Neumann and the Rényi entropies, provided a unique opportunity to access elusive features of quantum many-body systems. However, extracting entanglement properties analytically, experimentally, or in numerical simulations can be a formidable task. Here, by combining the replica trick and the Jarzynski equality we devise an alternative effec...

Journal: EURASIP J. Adv. Sig. Proc., 2004
Daniel Nicorici, Jaakko Astola

Heterogeneous DNA sequences can be partitioned into homogeneous domains that are composed of the four nucleotides A, C, G, and T and the stop codons. Recursively, we apply a new entropic segmentation method on DNA sequences using Jensen-Shannon and Jensen-Rényi divergences in order to find the borders between coding and noncoding DNA regions. We have chosen 12- and 18-symbol alphabets that captu...
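As a sketch of the kind of divergence used to score a candidate border between two adjacent windows (simplified to the 4-letter nucleotide alphabet; the paper's 12- and 18-symbol alphabets and its recursive procedure are not reproduced, and the function names are my own):

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy of a discrete distribution, in bits; alpha = 1 gives Shannon."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return -np.sum(p * np.log2(p))
    return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

def jensen_renyi(p, q, alpha=1.0, w=0.5):
    """Jensen-Rényi divergence between two symbol distributions; alpha = 1
    recovers the Jensen-Shannon divergence."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = w * p + (1 - w) * q
    return renyi_entropy(m, alpha) - (w * renyi_entropy(p, alpha)
                                      + (1 - w) * renyi_entropy(q, alpha))

# Empirical nucleotide frequencies (A, C, G, T) in two windows flanking a candidate border.
left  = [0.30, 0.20, 0.20, 0.30]
right = [0.15, 0.35, 0.35, 0.15]
print(jensen_renyi(left, right, alpha=1.0))   # Jensen-Shannon score
print(jensen_renyi(left, right, alpha=2.0))   # Jensen-Rényi score at alpha = 2
```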

2011
Kumar Sricharan, Alfred O. Hero

Rényi entropy is an information-theoretic measure of randomness which is fundamental to several applications. Several estimators of Rényi entropy based on k-nearest neighbor (kNN) distances have been proposed in the literature. For d-dimensional densities f, the variance of these Rényi entropy estimators of f decays as O(M^{-1}), where M is the sample size drawn from f. On the other hand, the bias...
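For context, a common member of this family of kNN-distance estimators is a Leonenko-Pronzato-Savani-style plug-in; the sketch below shows that baseline only (SciPy is assumed, and the improved estimators studied in the paper are not reproduced):

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import gamma

def knn_renyi_entropy(x, alpha, k=5):
    """k-NN Rényi entropy estimator of Leonenko-Pronzato-Savani type (alpha != 1).
    x is an (M, d) array of samples from the unknown density f."""
    x = np.asarray(x, dtype=float)
    M, d = x.shape
    # Distance from each sample to its k-th nearest neighbour (index 0 is the point itself).
    dist, _ = cKDTree(x).query(x, k=k + 1)
    eps = dist[:, -1]
    V_d = np.pi ** (d / 2) / gamma(d / 2 + 1)                 # volume of the unit d-ball
    C_k = (gamma(k) / gamma(k + 1 - alpha)) ** (1.0 / (1.0 - alpha))
    I_hat = np.mean(((M - 1) * C_k * V_d * eps ** d) ** (1.0 - alpha))
    return np.log(I_hat) / (1.0 - alpha)

# Sanity check on a 2-D standard Gaussian, whose Rényi entropy is known in closed form.
rng = np.random.default_rng(0)
samples = rng.standard_normal((5000, 2))
print(knn_renyi_entropy(samples, alpha=0.75, k=5))
```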

[Chart: number of search results per year]