Relative Entropy Rate between a Markov Chain and Its Corresponding Hidden Markov Chain


Abstract:

In this paper we study the relative entropy rate between a homogeneous Markov chain and the hidden Markov chain obtained by observing the output of a discrete stochastic channel whose input is that finite-state-space, homogeneous, stationary Markov chain. To this end, we first compute the relative entropy between finite subsequences of the two chains using the definition of relative entropy between two random variables; we then define the relative entropy rate between these stochastic processes and study its convergence.
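To make the object of study concrete, the following is a minimal numerical sketch in Python (the two-state chain, the binary symmetric channel, and all parameter values are our illustrative assumptions, not taken from the paper). It computes the per-symbol relative entropy D(P_n || Q_n)/n between the laws of length-n blocks of a stationary Markov chain X and of the hidden chain Y obtained by observing X through the channel, for increasing n:

import itertools
import numpy as np

P = np.array([[0.9, 0.1],      # Markov transition matrix, P[x, x'] = P(X_{t+1}=x' | X_t=x)
              [0.2, 0.8]])
eps = 0.1                      # channel crossover probability (assumed)
B = np.array([[1 - eps, eps],  # emission matrix, B[x, y] = P(Y_t=y | X_t=x)
              [eps, 1 - eps]])

# stationary distribution of P: left eigenvector for eigenvalue 1
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()

def markov_prob(seq):
    # P(X_1^n = seq) under the stationary Markov law
    p = pi[seq[0]]
    for a, b in zip(seq, seq[1:]):
        p *= P[a, b]
    return p

def hmm_prob(seq):
    # P(Y_1^n = seq) via the forward algorithm
    alpha = pi * B[:, seq[0]]
    for y in seq[1:]:
        alpha = (alpha @ P) * B[:, y]
    return alpha.sum()

for n in (2, 4, 8, 12):
    d = sum(markov_prob(s) * np.log(markov_prob(s) / hmm_prob(s))
            for s in itertools.product((0, 1), repeat=n))
    print(f"n = {n:2d}:  D(P_n || Q_n)/n = {d / n:.6f}")

The printed per-symbol values illustrate the convergence question the abstract raises; exhaustive enumeration is only feasible for short blocks, which is one motivation for studying the limit analytically.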


Similar resources

ADK Entropy and ADK Entropy Rate in Irreducible-Aperiodic Markov Chains and Gaussian Processes

In this paper, the two-parameter ADK entropy, a generalization of Rényi entropy, is considered and some of its properties are investigated. We will see that the ADK entropy for continuous random variables is invariant under a location transformation but is not invariant under a scale transformation of the random variable. Furthermore, the joint ADK entropy, conditional ADK entropy, and chain rule of this ent...
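The ADK definition itself is not reproduced in this excerpt, so the following Python sketch checks the analogous property only for the Rényi entropy that ADK is said to generalize (the Gaussian test density, the grid, and alpha = 2 are our illustrative assumptions): a location shift leaves the differential Rényi entropy unchanged, while scaling by a factor a shifts it by log a.

import numpy as np

def renyi_entropy(pdf, x, alpha=2.0):
    # h_alpha(f) = (1/(1-alpha)) * log( integral of f(x)^alpha dx ), via a Riemann sum
    dx = x[1] - x[0]
    return np.log(np.sum(pdf(x) ** alpha) * dx) / (1.0 - alpha)

def gaussian(mu, sigma):
    return lambda x: np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

x = np.linspace(-60.0, 60.0, 400001)
h = renyi_entropy(gaussian(0.0, 1.0), x)
print(renyi_entropy(gaussian(3.0, 1.0), x) - h)   # ~ 0:     location invariance
print(renyi_entropy(gaussian(0.0, 2.0), x) - h)   # ~ log 2: scaling by a adds log a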


Taylor Expansion for the Entropy Rate of Hidden Markov Chains

We study the entropy rate of a hidden Markov process, defined by observing the output of a symmetric channel whose input is a first-order Markov process. Although this definition is very simple, computing the exact entropy rate remains an open problem. We introduce some probability matrices based on the Markov chain's and the channel's parameters. Then, we try to obtain an estimate ...
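Since no closed form is known, a common simulation baseline (not the Taylor-expansion method of this paper; all parameter values are our illustrative assumptions) estimates the entropy rate as -(1/n) log P(y_1^n) for one long simulated output, evaluated with a normalized forward recursion:

import numpy as np

rng = np.random.default_rng(0)
P = np.array([[0.99, 0.01],     # first-order Markov input (assumed values)
              [0.30, 0.70]])
eps = 0.2                       # binary symmetric channel crossover (assumed)
B = np.array([[1 - eps, eps],
              [eps, 1 - eps]])

n = 100_000
x = np.empty(n, dtype=int)
x[0] = 0
for t in range(1, n):                          # simulate the Markov input
    x[t] = rng.random() < P[x[t - 1], 1]
y = np.where(rng.random(n) < eps, 1 - x, x)    # flip each symbol w.p. eps

alpha = np.full(2, 0.5)   # initial belief; its choice washes out in the rate
loglik = 0.0
for t in range(n):        # normalized forward recursion accumulating log P(y_1^n)
    if t > 0:
        alpha = alpha @ P
    alpha = alpha * B[:, y[t]]
    c = alpha.sum()
    loglik += np.log(c)
    alpha /= c

print("entropy rate estimate (nats/symbol):", -loglik / n)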


Markov Chain and Hidden Markov Model

Figure 1. Finite state machine for a Markov chain X_0 → X_1 → X_2 → · · · → X_n, where the random variables X_i take values in I = {S_1, S_2, S_3}. The numbers T(i, j) on the arrows are the transition probabilities, T(i, j) = P(X_{t+1} = S_j | X_t = S_i). Definition 1.2. We say that (X_n)_{n≥0} is a Markov chain with initial distribution λ and transition matrix T if (i) X_0 has distribution λ; (ii) for ...
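A minimal Python sketch of Definition 1.2 (the numeric values of λ and T below are our illustrative assumptions):

import numpy as np

rng = np.random.default_rng(1)
lam = np.array([0.5, 0.3, 0.2])        # initial distribution lambda of X_0
T = np.array([[0.7, 0.2, 0.1],         # T[i, j] = P(X_{t+1} = S_j | X_t = S_i)
              [0.1, 0.8, 0.1],
              [0.3, 0.3, 0.4]])

def simulate(n):
    path = [rng.choice(3, p=lam)]                   # (i)  X_0 has distribution lambda
    for _ in range(n - 1):
        path.append(rng.choice(3, p=T[path[-1]]))   # (ii) next state depends only on the current one
    return path

print(["S" + str(i + 1) for i in simulate(10)])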


Relative entropy between Markov transition rate matrices

We derive the relative entropy between two Markov transition rate matrices from sample path considerations. This relative entropy is interpreted as a "level 2.5" large deviations action functional. That is, the level-two large deviations action functional for empirical distributions of continuous-time Markov chains can be derived from the relative entropy using the contraction mapping principle...
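The derivation itself is not shown in this excerpt, but a commonly cited closed form for this quantity (stated here as an assumption, not quoted from the paper) is D(Q || Q') = sum_x pi(x) * sum_{y != x} [ q(x,y) log(q(x,y)/q'(x,y)) - q(x,y) + q'(x,y) ], where pi is the stationary distribution of Q. A Python sketch, assuming strictly positive off-diagonal rates:

import numpy as np

def relative_entropy_rate(Q, Qp):
    # stationary distribution of Q: left eigenvector for eigenvalue 0
    evals, evecs = np.linalg.eig(Q.T)
    pi = np.real(evecs[:, np.argmin(np.abs(evals))])
    pi = pi / pi.sum()
    d = 0.0
    n = Q.shape[0]
    for x in range(n):
        for y in range(n):
            if y != x and Q[x, y] > 0:   # assumes Q's off-diagonal rates are positive
                d += pi[x] * (Q[x, y] * np.log(Q[x, y] / Qp[x, y])
                              - Q[x, y] + Qp[x, y])
    return d

Q  = np.array([[-1.0, 1.0], [ 2.0, -2.0]])   # illustrative rate matrices
Qp = np.array([[-0.5, 0.5], [ 1.0, -1.0]])
print(relative_entropy_rate(Q, Qp))          # about 0.258 nats per unit time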


The Relative Entropy Rate For Two Hidden Markov Processes

The relative entropy rate is a natural and useful measure of distance between two stochastic processes. In this paper we study the relative entropy rate between two Hidden Markov Processes (HMPs), which is of both theoretical and practical importance. We give new results showing analyticity, representation using Lyapunov exponents, and Taylor expansion for the relative entropy rate of two discr...
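A simple simulation-based baseline (not the analytic machinery of this paper; all parameters are our illustrative assumptions) estimates the relative entropy rate between two HMPs as (1/n) [log P(y_1^n) - log Q(y_1^n)] along one long output drawn from the first process, with both likelihoods computed by normalized forward recursions:

import numpy as np

rng = np.random.default_rng(2)

def forward_loglik(y, P, B, init):
    # log-likelihood of y under the HMP (P, B, init), normalized at each step
    alpha, loglik = init.copy(), 0.0
    for t, obs in enumerate(y):
        if t > 0:
            alpha = alpha @ P
        alpha = alpha * B[:, obs]
        c = alpha.sum()
        loglik += np.log(c)
        alpha = alpha / c
    return loglik

def simulate_hmp(n, P, B, init):
    # draw one output sequence of length n from the HMP (P, B, init)
    x = rng.choice(len(init), p=init)
    y = np.empty(n, dtype=int)
    for t in range(n):
        y[t] = rng.choice(B.shape[1], p=B[x])
        x = rng.choice(len(init), p=P[x])
    return y

# two HMPs over the same binary alphabet (illustrative parameters)
P1, B1 = np.array([[0.9, 0.1], [0.2, 0.8]]), np.array([[0.8, 0.2], [0.3, 0.7]])
P2, B2 = np.array([[0.6, 0.4], [0.5, 0.5]]), np.array([[0.7, 0.3], [0.4, 0.6]])
init = np.array([0.5, 0.5])

y = simulate_hmp(50_000, P1, B1, init)
rate = (forward_loglik(y, P1, B1, init) - forward_loglik(y, P2, B2, init)) / len(y)
print("estimated relative entropy rate (nats/symbol):", rate)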




Volume 8, Issue 1
Pages 97-110
Publication date: 2011-09


