Persistence exponents in Markov chains
Authors
Abstract
We prove the existence of the persistence exponent log λ := lim_{n→∞} (1/n) log P_μ(X_0 ∈ S, …, X_n ∈ S) for a class of time-homogeneous Markov chains {X_i}_{i≥0} taking values in a Polish space, where S is a Borel set and μ is the initial distribution. Focusing on the case of AR(p) and MA(q) processes, p, q ∈ ℕ, with continuous innovations, we study the continuity of λ with respect to the AR and MA parameters, for S = [0, ∞). For AR processes with log-concave innovations, we prove strict monotonicity of λ. Finally, we compute the exponents explicitly in some concrete examples.
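The limit in the abstract can be probed numerically. Below is a minimal Monte Carlo sketch in Python (the function name is hypothetical; standard Gaussian innovations and S = [0, ∞) are assumptions chosen for illustration) that estimates (1/(n+1)) log P(X_0 ∈ S, …, X_n ∈ S) for an AR(1) chain:

```python
import math
import random

def persistence_exponent_ar1(a, n, trials, seed=0):
    """Monte Carlo estimate of (1/(n+1)) * log P(X_0 >= 0, ..., X_n >= 0)
    for the AR(1) chain X_{k+1} = a*X_k + xi_{k+1} with i.i.d. standard
    Gaussian innovations, started from X_0 = xi_0."""
    rng = random.Random(seed)
    survived = 0
    for _ in range(trials):
        x = rng.gauss(0.0, 1.0)          # X_0
        ok = x >= 0.0
        k = 0
        while ok and k < n:              # stop a trial as soon as it exits S
            x = a * x + rng.gauss(0.0, 1.0)
            ok = x >= 0.0
            k += 1
        if ok:
            survived += 1
    p = survived / trials                # estimate of the persistence probability
    return math.log(p) / (n + 1)

est = persistence_exponent_ar1(0.0, 8, 200_000)
```

For a = 0 the chain is an i.i.d. symmetric sequence, so P(X_0 ≥ 0, …, X_n ≥ 0) = 2^{-(n+1)} exactly and the estimate should land near −log 2 ≈ −0.693; for a ≠ 0 no such closed form is assumed here.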
Similar resources
Empirical Bayes Estimation in Nonstationary Markov chains
Estimation procedures for nonstationary Markov chains appear to be relatively sparse. This work introduces empirical Bayes estimators for the transition probability matrix of a finite nonstationary Markov chain. The data are assumed to be of a panel study type in which each data set consists of a sequence of observations on N>=2 independent and identically dis...
Markov chains
[Tip: Study the MC, QT, and Little's law lectures together: CTMC (MC lecture), M/M/1 queue (QT lecture), Little's law lecture (when deriving the mean response time from the mean number of customers), DTMC (MC lecture), M/M/1 queue derivation using DTMC analysis, derive the distribution of response time in the M/M/1 queue (QT lecture), relation between the Markov property and the memoryless property (MC lecture), ...
Loss of Memory of Random Functions of Markov Chains and Lyapunov Exponents
In this paper we prove that the asymptotic rate of exponential loss of memory of a random function of a Markov chain (Zt)t∈Z is bounded above by the difference of the first two Lyapunov exponents of a certain product of matrices. We also show that this bound is in fact realized, namely for almost all realizations of the process (Zt)t∈Z, we can find symbols where the asymptotic exponential rate o...
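The bound in the abstract above is phrased in terms of the first two Lyapunov exponents of a matrix product. A standard way to estimate them (a generic sketch, not the paper's construction; the Gaussian matrix ensemble is an assumption for illustration) is to push an orthonormal frame through the product, re-orthogonalising at each step and accumulating the logs of the QR diagonal:

```python
import math
import random

def top_two_lyapunov(sample_matrix, steps, seed=0):
    """Estimate the two largest Lyapunov exponents of a product of i.i.d.
    random 2x2 matrices by evolving an orthonormal frame and accumulating
    the logarithms of the Gram-Schmidt (QR) diagonal entries."""
    rng = random.Random(seed)
    v1, v2 = (1.0, 0.0), (0.0, 1.0)      # initial orthonormal frame
    s1 = s2 = 0.0
    for _ in range(steps):
        m = sample_matrix(rng)
        w1 = (m[0][0]*v1[0] + m[0][1]*v1[1], m[1][0]*v1[0] + m[1][1]*v1[1])
        w2 = (m[0][0]*v2[0] + m[0][1]*v2[1], m[1][0]*v2[0] + m[1][1]*v2[1])
        r11 = math.hypot(w1[0], w1[1])   # growth along the first direction
        v1 = (w1[0]/r11, w1[1]/r11)
        proj = w2[0]*v1[0] + w2[1]*v1[1]
        u2 = (w2[0] - proj*v1[0], w2[1] - proj*v1[1])
        r22 = math.hypot(u2[0], u2[1])   # growth orthogonal to it
        v2 = (u2[0]/r22, u2[1]/r22)
        s1 += math.log(r11)
        s2 += math.log(r22)
    return s1/steps, s2/steps

def sample_matrix(rng):
    # hypothetical example: i.i.d. standard Gaussian entries
    return [[rng.gauss(0, 1), rng.gauss(0, 1)],
            [rng.gauss(0, 1), rng.gauss(0, 1)]]

l1, l2 = top_two_lyapunov(sample_matrix, 20_000)
```

The difference l1 − l2 is the quantity that bounds the loss-of-memory rate in the result quoted above; for this Gaussian ensemble the two exponents are distinct, so the bound is strictly positive.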
Taylor Expansion for the Entropy Rate of Hidden Markov Chains
We study the entropy rate of a hidden Markov process, defined by observing the output of a symmetric channel whose input is a first-order Markov process. Although this definition is very simple, computing the entropy rate exactly is an open problem. We introduce some probability matrices based on the parameters of the Markov chain and the channel. Then, we try to obtain an estimate ...
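Although no closed form is known, the entropy rate discussed in the abstract above can be estimated from a single long sample path as −(1/n) log p(y_1, …, y_n), with the likelihood computed by the normalised forward recursion (a consequence of the Shannon–McMillan–Breiman theorem). A sketch, under the assumption of a symmetric two-state Markov source with flip probability q observed through a binary symmetric channel with crossover eps (parameter names hypothetical):

```python
import math
import random

def estimate_entropy_rate(q, eps, n, seed=0):
    """Estimate (in nats) the entropy rate of the output of a binary
    symmetric channel with crossover eps driven by a symmetric two-state
    Markov chain with flip probability q, via -(1/n) log p(y_1..y_n)
    computed with the normalised forward recursion."""
    rng = random.Random(seed)
    # simulate the hidden chain and its noisy observations
    x = rng.randrange(2)
    ys = []
    for _ in range(n):
        if rng.random() < q:
            x ^= 1
        ys.append(x ^ 1 if rng.random() < eps else x)
    # forward recursion on the filtering distribution P(x_t | y_1..y_t)
    alpha = [0.5, 0.5]
    logp = 0.0
    for y in ys:
        pred = [alpha[0]*(1-q) + alpha[1]*q,   # one-step prediction
                alpha[0]*q + alpha[1]*(1-q)]
        lik = [pred[s] * ((1-eps) if y == s else eps) for s in (0, 1)]
        c = lik[0] + lik[1]                    # p(y_t | y_1..y_{t-1})
        logp += math.log(c)
        alpha = [lik[0]/c, lik[1]/c]
    return -logp / n

h = estimate_entropy_rate(0.3, 0.0, 200_000)
```

With eps = 0 the output is the Markov chain itself, so the estimate should approach the binary entropy −q log q − (1−q) log(1−q); with eps > 0 it estimates the hidden-Markov entropy rate the abstract describes.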
The Rate of Rényi Entropy for Irreducible Markov Chains
In this paper, we obtain the Rényi entropy rate for irreducible-aperiodic Markov chains with countable state space, using the theory of countable nonnegative matrices. We also obtain the bound for the rate of Rényi entropy of an irreducible Markov chain. Finally, we show that the bound for the Rényi entropy rate is the Shannon entropy rate.
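In the finite-state case, a concrete formula of this type (attributed to Rached, Alajaji and Campbell) gives the Rényi entropy rate of order α ≠ 1 of an irreducible chain with transition matrix P as (1/(1−α)) log λ, where λ is the Perron root of the matrix with entries P_ij^α. A small Python sketch using power iteration (function name hypothetical):

```python
import math

def renyi_entropy_rate(P, alpha, iters=500):
    """Renyi entropy rate (order alpha != 1, in nats) of an irreducible
    finite Markov chain with transition matrix P, computed as
    log(Perron root of [P_ij ** alpha]) / (1 - alpha)."""
    n = len(P)
    Q = [[P[i][j] ** alpha for j in range(n)] for i in range(n)]
    v = [1.0] * n
    lam = 1.0
    for _ in range(iters):                 # power iteration for the Perron root
        w = [sum(Q[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(w)
        v = [wi / lam for wi in w]
    return math.log(lam) / (1.0 - alpha)

# For the symmetric two-state chain with p = 1/2, every Renyi rate
# equals log 2, matching the Shannon entropy rate.
P = [[0.5, 0.5], [0.5, 0.5]]
rate = renyi_entropy_rate(P, 2.0)
```

As α → 1 this expression recovers the Shannon entropy rate, consistent with the limiting statement in the abstract above.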
Journal
Journal title: Annales de l'I.H.P
Year: 2021
ISSN: ['0246-0203', '1778-7017']
DOI: https://doi.org/10.1214/20-aihp1114