Entropy Estimate for Maps on Forests

Author:

  • M. Sabbaghan
Abstract:

A 1993 result of J. Llibre and M. Misiurewicz (Theorem A of [5]) states that if a continuous map f of a graph into itself has an s-horseshoe, then the topological entropy of f is greater than or equal to log s, that is, h(f) ≥ log s. A 1980 result of L.S. Block, J. Guckenheimer, M. Misiurewicz and L.S. Young (Lemma 1.5 of [3]) states that if G is an A-graph of f, then h(G) ≤ h(f). In this paper we generalize Theorem A and Lemma 1.5 to continuous functions on forests. Let F be a forest and f : F → F a continuous function. Using the adjacency matrix of a graph, we give a lower bound for the topological entropy of f.
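To make the adjacency-matrix bound concrete: the entropy of the subshift determined by a 0-1 transition matrix equals the logarithm of its spectral radius, so once an A-graph of f is in hand, the log of the spectral radius of its adjacency matrix bounds h(f) from below. The following is a minimal numerical sketch, assuming the A-graph is already given as a 0-1 adjacency matrix; the function name entropy_lower_bound is ours for illustration, not from the paper.

```python
import numpy as np

def entropy_lower_bound(adjacency):
    """Lower bound for h(f) from the adjacency matrix of an A-graph.

    The entropy of the subshift defined by a 0-1 transition matrix
    equals log of its spectral radius; by Lemma 1.5-type results this
    bounds the topological entropy of f from below.
    """
    A = np.asarray(adjacency, dtype=float)
    spectral_radius = max(abs(np.linalg.eigvals(A)))
    # Topological entropy is nonnegative, so clamp the bound at 0.
    return max(np.log(spectral_radius), 0.0) if spectral_radius > 0 else 0.0

# An s-horseshoe with s = 3 yields the full transition matrix on three
# symbols, whose spectral radius is 3, recovering h(f) >= log 3.
print(entropy_lower_bound(np.ones((3, 3))))  # ~1.0986 = log 3
```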


Similar resources


Entropy Estimate For High Dimensional Monotonic Functions

We establish upper and lower bounds for the metric entropy and bracketing entropy of the class of d-dimensional bounded monotonic functions under L_p norms. It is interesting to see that both the metric entropy and bracketing entropy behave differently for p < d/(d − 1) and p > d/(d − 1). We apply the new bounds for bracketing entropy to establish a global rate of convergence of the MLE of ...
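As a quick illustration of the threshold in the snippet above (our arithmetic, not from that paper), the critical exponent p* = d/(d − 1) separating the two entropy regimes decreases toward 1 as the dimension grows:

```python
# Critical L_p exponent p* = d / (d - 1) separating the two entropy
# regimes for d-dimensional bounded monotone functions.
for d in (2, 3, 4, 10):
    print(f"d = {d:2d}  ->  p* = {d / (d - 1):.4f}")
# d =  2 -> 2.0000, d =  3 -> 1.5000, d =  4 -> 1.3333, d = 10 -> 1.1111
```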

full text

Do Hebbian synapses estimate entropy?

Hebbian learning is one of the mainstays of biologically inspired neural processing. Hebb's rule is biologically plausible, and it has been extensively utilized both in computational neuroscience and in unsupervised training of neural systems. In these fields, Hebbian learning has become synonymous with correlation learning. But it is known that correlation is a second-order statistic of the data, s...
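A minimal sketch of why Hebbian learning is identified with correlation learning (our illustration, not from that paper; hebbian_update and all parameters are hypothetical): with linear post-synaptic activity and Oja-style normalization, repeated Hebbian updates align the weights with the dominant eigenvector of the input covariance, which is a purely second-order statistic.

```python
import numpy as np

def hebbian_update(w, x, y, lr=0.001):
    # Plain Hebb rule: weight change proportional to the product of
    # post-synaptic activity y and pre-synaptic activity x.
    return w + lr * np.outer(y, x)

rng = np.random.default_rng(0)
w = rng.standard_normal((1, 4)) * 0.1
cov = np.diag([3.0, 1.0, 0.5, 0.1])  # anisotropic zero-mean inputs
for _ in range(2000):
    x = rng.multivariate_normal(np.zeros(4), cov)
    w = hebbian_update(w, x, w @ x)
    w /= np.linalg.norm(w)  # Oja-style normalization keeps w bounded
print(w)  # concentrates on the first coordinate, the top eigenvector of cov
```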

full text

Pre-image Entropy for Maps on Noncompact Topological Spaces

We propose a new definition of pre-image entropy for continuous maps on noncompact topological spaces, investigate fundamental properties of the new pre-image entropy, and compare it with the existing ones. The new pre-image entropy generalizes that of Cheng and Newhouse, yet it retains various basic properties of Cheng and Newhouse's pre-image entropy, for example, the ...
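For reference, the Cheng–Newhouse pre-image entropy that this snippet compares against is, as we recall it (a hedged sketch; consult their paper for the precise statement), defined for a continuous map f on a compact metric space X by

$$ h_{\mathrm{pre}}(f) \;=\; \lim_{\varepsilon \to 0}\; \limsup_{n \to \infty} \frac{1}{n} \log \sup_{x \in X,\; k \ge n} s\bigl(n, \varepsilon, f^{-k}(x)\bigr), $$

where s(n, ε, Z) denotes the maximal cardinality of an (n, ε)-separated subset of Z; the proposed definition extends this quantity beyond the compact metric setting.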

full text



Volume 21, Issue 1
Publication date: 2010-03-01

