
Information theory: find entropy given a Markov chain

There is an information source with source alphabet $A = \{a, b, c\}$, represented by the state transition diagram below:

[Figure: Markov chain state transition diagram]

a) The random variable representing the $i$-th output from this information source is denoted $X_i$. It is known that the source is now in state $S_1$. In this state, let $H(X_i \mid s_1)$ denote the entropy when observing the next symbol $X_i$. Find the value of $H(X_i \mid s_1)$ and the entropy of this information source; calculate $H(X_i \mid X_{i-1})$ and $H(X_i)$ respectively, assuming $i$ is quite large.

How can I find $H(X_i \mid s_1)$? I know that
$$H(X_i \mid s_1) = -\sum_{x_i} p\left(x_i, s_1\right)\log_b p\left(x_i \mid s_1\right) = -\sum_{x_i} p\left(x_i, s_1\right)\log_b\!\left(\frac{p\left(x_i, s_1\right)}{p\left(s_1\right)}\right),$$
but I don't know $p(s_1)$.
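For what it's worth, if the current state is known to be $S_1$, one common reading (an assumption here, since it depends on how the diagram labels its edges: each transition out of $S_1$ is taken to emit a distinct symbol, so the next-symbol distribution is just the first row of the matrix below) gives

$$H(X_i \mid s_1) = -\sum_{x_i} p\left(x_i \mid s_1\right)\log_2 p\left(x_i \mid s_1\right) = -\bigl(0.25\log_2 0.25 + 0.75\log_2 0.75\bigr) \approx 0.811\ \text{bits},$$

with the zero-probability term contributing nothing. In this form only the conditional probabilities appear, so $p(s_1)$ is not needed for this particular quantity.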

$$A=\begin{pmatrix}0.25 & 0.75 & 0\\ 0.5 & 0 & 0.5 \\ 0 & 0.7 & 0.3 \end{pmatrix}.$$

From the matrix I know that $p(s_1 \mid s_1) = 0.25$, etc.

But what is the probability of $s_1$? And how can I calculate $H(X_i \mid X_{i-1})$?
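A minimal numerical sketch of one way to compute these quantities (under assumptions not stated explicitly in the post: the rows of $A$ are $P(\text{next state} \mid \text{current state})$, each transition emits the symbol of the state it enters, and entropies are in bits). The stationary distribution $\pi$ solves $\pi A = \pi$ with $\sum_j \pi_j = 1$, which for this matrix works out to $\pi = (0.28, 0.42, 0.30)$, so $p(s_1) = 0.28$; then $H(X_i \mid X_{i-1}) = \sum_j \pi_j\, H(\text{row } j)$ and, for large $i$, $H(X_i)$ is the entropy of the stationary symbol distribution.

```python
import numpy as np

# Transition matrix from the question: rows are P(next state | current state).
A = np.array([
    [0.25, 0.75, 0.0],
    [0.5,  0.0,  0.5],
    [0.0,  0.7,  0.3],
])

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability terms."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Stationary distribution: left eigenvector of A for eigenvalue 1,
# normalized to sum to 1 (i.e. solve pi @ A = pi, sum(pi) = 1).
eigvals, eigvecs = np.linalg.eig(A.T)
pi = np.real(eigvecs[:, np.isclose(eigvals, 1)][:, 0])
pi = pi / pi.sum()

# H(X_i | s_1): entropy of the next symbol given the current state is S_1.
# Assumes each transition out of S_1 emits a distinct symbol, so this is
# just the entropy of the first row of A.
H_given_s1 = entropy(A[0])

# H(X_i | X_{i-1}): per-row entropies averaged over the stationary distribution.
H_cond = np.sum([pi[j] * entropy(A[j]) for j in range(3)])

# H(X_i): entropy of the marginal symbol distribution. Assuming the symbol
# emitted is the label of the state being entered (a, b, c for S_1, S_2, S_3),
# the marginal of X_i for large i is the stationary distribution itself.
H_marginal = entropy(pi)

print("stationary distribution:", pi)
print("H(X_i|s_1)     =", H_given_s1)
print("H(X_i|X_{i-1}) =", H_cond)
print("H(X_i)         =", H_marginal)
```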
