Search results for: entropy model

Number of results: 2154089

2011
Elliott Sober, Mike Steel

Markov models of evolution describe changes in the probability distribution of the trait values a population might exhibit. In consequence, they also describe how entropy and conditional entropy values evolve, and how the mutual information that characterizes the relation between an earlier and a later moment in a lineage’s history depends on how much time separates them. These models therefore...

2008
Jean-François Bercher

Distributions derived from the maximization of Rényi-Tsallis entropy are often called Tsallis’ distributions. We first indicate that these distributions can arise as mixtures, and can be interpreted as the solution of a standard maximum entropy problem with fluctuating constraints. Considering that Tsallis’ distributions appear for systems with displaced or fluctuating equilibria, we show tha...

2014
Mihály Ormos, Dávid Zibriczky

We investigate entropy as a financial risk measure. Entropy explains the equity premium of securities and portfolios in a simpler way and, at the same time, with higher explanatory power than the beta parameter of the capital asset pricing model. For asset pricing we define the continuous entropy as an alternative measure of risk. Our results show that entropy decreases as a function of the n...

2001
Bernhard Baumgartner

Consider a state of a system with several subsystems. The entropies of the reduced state on different subsystems obey certain inequalities, provided there is an equivalence relation, and a function measuring volumes or weights of subsystems. The entropy per unit volume or unit weight, the mean entropy, is then decreasing with respect to an order relation of the subsystems, defined in this paper...

2007
Jérôme Buzzi

We study the dynamics of piecewise affine surface homeomorphisms from the point of view of their entropy. Under the assumption of positive topological entropy, we establish the existence of finitely many ergodic and invariant probability measures maximizing entropy and prove a multiplicative lower bound for the number of periodic points. This is intended as a step towards the understanding of s...

2006
Astrid Zeman, Mikhail Prokopenko

This paper investigates cluster formation in decentralized sensor grids and focuses on predicting when the cluster formation converges to a stable configuration. The traffic volume of inter-agent communications is used as the underlying time series to construct a predictor of the convergence time. The predictor is based on the assumption that decentralized cluster formation creates multiagen...

2009
Alejandro Vega, Nigel G. Ward

The entropy constancy principle describes the tendency for information in language to be conveyed at a constant rate. We explore the possible role of this principle in spoken dialog, using the “summed entropy rate,” that is, the sum of the entropies of the words of both speakers per second of time. Using the Switchboard corpus of casual dialogs and a standard n-gram language model to estimate en...
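The "summed entropy rate" defined in this abstract is straightforward to compute once per-word probabilities are available. The helper below is a hypothetical sketch (function and variable names are illustrative) that assumes word probabilities have already been obtained from an n-gram language model.

```python
import math

def summed_entropy_rate(utterances, duration_s):
    """Summed entropy rate: total per-word surprisal (-log2 p) of both
    speakers' words, divided by the dialog duration in seconds.
    `utterances` maps each speaker to a list of (word, probability) pairs,
    with probabilities assumed to come from an n-gram language model."""
    total_bits = sum(-math.log2(p)
                     for words in utterances.values()
                     for _, p in words)
    return total_bits / duration_s

# Toy two-speaker dialog with made-up model probabilities.
dialog = {
    "A": [("hello", 0.02), ("there", 0.05)],
    "B": [("hi", 0.03)],
}
print(summed_entropy_rate(dialog, duration_s=2.0))  # bits per second
```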

Journal: J. Applied Probability, 2016
Zdravko I. Botev, Ad Ridder, Leonardo Rojas-Nandayapa

The Cross Entropy method is a well-known adaptive importance sampling method for rare-event probability estimation, which requires estimating an optimal importance sampling density within a parametric class. In this article we estimate an optimal importance sampling density within a wider semiparametric class of distributions. We show that this semiparametric version of the Cross Entropy method...
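The standard parametric Cross Entropy method that this abstract starts from can be sketched on a toy rare-event problem: estimating P(X > γ) for an Exp(1) nominal density by tilting within the exponential family. The function name and parameters below are illustrative, and this is the textbook CE scheme, not the paper's semiparametric variant.

```python
import random, math

def ce_rare_event(gamma, n=10_000, rho=0.1, iters=8, seed=0):
    """Parametric Cross Entropy sketch: estimate P(X > gamma) for
    X ~ Exp(mean 1) via adaptive importance sampling in the
    exponential family Exp(mean v)."""
    rng = random.Random(seed)
    v = 1.0  # mean of the current importance sampling density
    for _ in range(iters):
        x = sorted(rng.expovariate(1.0 / v) for _ in range(n))
        # elite threshold: the (1 - rho) sample quantile, capped at gamma
        level = min(x[int((1 - rho) * n)], gamma)
        # likelihood ratio of nominal Exp(1) w.r.t. Exp(mean v)
        w = [v * math.exp(-xi + xi / v) for xi in x]
        num = sum(wi * xi for xi, wi in zip(x, w) if xi >= level)
        den = sum(wi for xi, wi in zip(x, w) if xi >= level)
        v = num / den  # weighted MLE update of the tilting parameter
        if level >= gamma:
            break
    # final importance sampling estimate under Exp(mean v)
    x = [rng.expovariate(1.0 / v) for _ in range(n)]
    est = sum(v * math.exp(-xi + xi / v) for xi in x if xi > gamma) / n
    return est, v

est, v = ce_rare_event(gamma=20.0)
print(est, math.exp(-20))  # estimate vs. the exact value e^{-20}
```

For the exponential family the CE update reduces to a likelihood-ratio-weighted mean of the elite samples, which drives v toward the optimal tilting mean γ + 1.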

2010
Christopher A. Zapart

The paper describes an alternative approach to forecasting financial time series based on entropy (C. A. Zapart, On entropy, financial markets and minority games, Physica A: Statistical Mechanics and its Applications, 388 (7) 2009, pages 1157-1172). The research builds upon an earlier statistical analysis of financial time series with Shannon information entropy, published in (Molgedey, L and E...

The Rényi entropy is a generalization of Shannon entropy to a one-parameter family of entropies. Tsallis entropy is likewise a one-parameter generalization of Shannon entropy, but its measure is non-logarithmic. After the introduction of Shannon entropy, the conditional Shannon entropy was derived and its properties became known. Also, for Tsallis entropy, the conditional entropy was introduced a...
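The relationship sketched above can be made concrete: both one-parameter families recover Shannon entropy as their parameter tends to 1, while Tsallis entropy is defined without a logarithm of the probabilities. A minimal sketch (function names are illustrative):

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum p_i ln p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(p) = ln(sum p_i^alpha) / (1 - alpha), alpha != 1."""
    return math.log(sum(pi ** alpha for pi in p)) / (1 - alpha)

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p) = (1 - sum p_i^q) / (q - 1); non-logarithmic."""
    return (1 - sum(pi ** q for pi in p)) / (q - 1)

p = [0.5, 0.25, 0.25]
# Both families approach Shannon entropy as their parameter tends to 1.
print(shannon_entropy(p))          # ≈ 1.0397
print(renyi_entropy(p, 1.0001))    # ≈ 1.0397
print(tsallis_entropy(p, 1.0001))  # ≈ 1.0397
```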
