Let X_{1} and X_{2} be independent random variables. X_{1} has mean 0 and variance 1, while X_{2} has mean 1 and variance 4. The mutual information I(X_{1}; X_{2}) between X_{1} and X_{2} in bits is _______.

__Concept: __

Mutual information measures how much information one random variable conveys about the other.

It is mathematically defined as:

I(X_{1}; X_{2}) = H(X_{1}) – H(X_{1}/X_{2})

__Application__:

Since X_{1} and X_{2} are independent, we can write:

H(X_{1}/X_{2}) = H(X_{1})

I(X_{1}; X_{2}) = H(X_{1}) – H(X_{1})

= 0

The given means and variances are immaterial; independence alone makes the mutual information zero.
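As a numerical sanity check, the sketch below builds the joint distribution of two independent discrete variables as the outer product of their marginals (the marginal values here are hypothetical, not from the problem) and evaluates the mutual information directly from its definition; every log term vanishes under independence.

```python
import math

def mutual_information(joint, px, py):
    """I(X;Y) = sum over x,y of p(x,y) * log2( p(x,y) / (p(x)*p(y)) ), in bits."""
    total = 0.0
    for i, pi in enumerate(px):
        for j, pj in enumerate(py):
            if joint[i][j] > 0:
                total += joint[i][j] * math.log2(joint[i][j] / (pi * pj))
    return total

# Hypothetical marginals (any values work; independence is what matters):
px = [0.3, 0.7]
py = [0.6, 0.4]
# Independence means the joint is the outer product of the marginals.
joint = [[a * b for b in py] for a in px]
print(mutual_information(joint, px, py))  # 0.0
```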

For the channel shown below, if the source generates two symbols m_{0} and m_{1} with probabilities 0.6 and 0.4 respectively, the probability of error when the receiver uses MAP decoding will be _______ (correct up to two decimal places).

**If r_{0} is received:**

P(m_{0}) P(r_{0}/m_{0}) = 0.6 × 0.6 = 0.36

Also,

P(m_{1}) P(r_{0}/m_{1}) = 0.4 × 0 = 0

**If r_{1} is received:**

P(m_{0}) P (r_{1}/m_{0}) = (0.6) (0.3)

= 0.18

P(m_{1}) P(r_{1}/m_{1})

= (0.4) (0.7)

= 0.28

**If r_{2} is received:**

P(m_{0}) P(r_{2}/m_{0})

= (0.6) (0.1)

= 0.06

P(m_{1}) P(r_{2}/m_{1})

= (0.4) (0.3)

= 0.12

Probability of correct detection

P_{c} = 0.12 + 0.28 + 0.36

= 0.76

P_{e} = 1 – 0.76 = 0.24
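The MAP rule worked out above can be sketched in code. Since the channel figure is not reproduced here, the transition probabilities below are read off the products in the solution (0.6, 0.3, 0.1 from m_{0}; 0, 0.7, 0.3 from m_{1}):

```python
priors = [0.6, 0.4]                      # P(m0), P(m1)
# Transition probabilities P(r_j | m_i), as implied by the products above:
channel = [[0.6, 0.3, 0.1],              # m0 -> r0, r1, r2
           [0.0, 0.7, 0.3]]              # m1 -> r0, r1, r2

# MAP: for each received r_j, decide the m_i maximising P(m_i) * P(r_j | m_i).
# The probability of correct detection is the sum of the winning terms.
p_correct = sum(max(priors[i] * channel[i][j] for i in range(2))
                for j in range(3))
p_error = 1 - p_correct
print(round(p_correct, 2), round(p_error, 2))  # 0.76 0.24
```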

For the channel shown below, if the source generates symbols m_{0} and m_{1} with equal probability, the probability of error using ML decoding is _______.

If r_{0} is received

P(m_{0}) P(r_{0}/m_{0}) = 0.5 × 0.6 = 0.30

P(m_{1}) P(r_{0}/m_{1}) = 0.5 × 0 = 0

If r_{1} is received

P(m_{0}) P(r_{1}/m_{0})

= (0.5) (0.3)

= 0.15

P(m_{1}) P(r_{1}/m_{1})

= (0.5) (0.7)

= 0.35

If r_{2} is received

P(m_{0}) P(r_{2}/m_{0})

= (0.5) (0.1)

= 0.05

P(m_{1}) P(r_{2}/m_{1})

= (0.5) (0.3)

= 0.15

Probability of correct detection = 0.30 + 0.35 + 0.15

= 0.80

Probability of error = 1 – 0.80

= 0.20
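ML decoding differs from MAP only in that the priors are ignored when deciding; with equal priors the two rules coincide. A sketch, using the same channel values implied by the products above:

```python
priors = [0.5, 0.5]                      # equiprobable m0, m1
channel = [[0.6, 0.3, 0.1],              # P(r_j | m0), implied by the solution
           [0.0, 0.7, 0.3]]              # P(r_j | m1)

# ML: for each received r_j, decide the m_i maximising the likelihood
# P(r_j | m_i) alone, ignoring the priors.
p_correct = 0.0
for j in range(3):
    decided = max(range(2), key=lambda i: channel[i][j])
    p_correct += priors[decided] * channel[decided][j]
p_error = 1 - p_correct
print(round(p_correct, 2), round(p_error, 2))  # 0.8 0.2
```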

Consider a binary channel with input probabilities

P(x_{1}) = 0.5

P(x_{2}) = 0.5

Find the mutual information in bits/symbol.

P(y_{1}) = 3/8

P(y_{2}) = 5/8

Mutual information I(xy):

I(xy) = H(x) - H(x/y)

= H(y) - H(y/x)

H(y) = – Σ P(y_{j}) log_{2} P(y_{j})

= - [0.375 log_{2}(0.375) + 0.625 log_{2}(0.625)]

\(H\left( {\frac{y}{x}} \right) = - \sum\limits_i \sum\limits_j P\left( {{x_i},{y_j}} \right){\log _2}P\left( {\frac{{{y_j}}}{{{x_i}}}} \right)\)

\(= - \left[ {\frac{1}{4}{{\log }_2}\left( {\frac{1}{2}} \right) + \frac{1}{4}{{\log }_2}\left( {\frac{1}{2}} \right) + \frac{1}{8}{{\log }_2}\frac{1}{4} + \frac{3}{8}{{\log }_2}\frac{3}{4}} \right]\)

I(xy) = 0.954 – 0.906 ≈ 0.05 bits/symbol
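The same computation as a sketch. The conditional probabilities below are recovered from the joint terms in the H(y/x) sum above (P(y/x₁) = ½, ½ and P(y/x₂) = ¼, ¾), since the channel figure is not reproduced:

```python
import math

px = [0.5, 0.5]                  # P(x1), P(x2)
channel = [[0.5, 0.5],           # P(y_j | x1)
           [0.25, 0.75]]         # P(y_j | x2)

# Output distribution: P(y_j) = sum_i P(x_i) * P(y_j | x_i)
py = [sum(px[i] * channel[i][j] for i in range(2)) for j in range(2)]
assert py == [0.375, 0.625]      # matches P(y1) = 3/8, P(y2) = 5/8

h_y = -sum(p * math.log2(p) for p in py)
h_y_given_x = -sum(px[i] * channel[i][j] * math.log2(channel[i][j])
                   for i in range(2) for j in range(2))
print(round(h_y - h_y_given_x, 2))  # 0.05
```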

In data communication using an error-detecting code, as soon as an error is detected, an automatic repeat request (ARQ) triggers retransmission of the data. Such a binary erasure channel can be modeled as shown:

If p = 0.2 and both symbols are generated with equal probability, then the mutual information I(x, y) is _______.

I (xy) = (1 - p) H (x)

\(\begin{array}{l} = \left( {1 - 0.2} \right)\left[ {\alpha \log \frac{1}{\alpha } + \left( {1 - \alpha } \right)\log \frac{1}{{\left( {1 - \alpha } \right)}}} \right]\\ = \left[ {0.8} \right]\left[ {\frac{1}{2}\log 2 + \frac{1}{2}\log 2} \right] \end{array}\)

= 0.4 × [2 log_{2} 2]

= 0.8
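A sketch of the erasure-channel formula I(x, y) = (1 – p) H(x) used above, with equiprobable inputs:

```python
import math

p = 0.2        # erasure probability
alpha = 0.5    # both input symbols equiprobable
# Binary entropy of the input distribution, in bits:
h_x = -(alpha * math.log2(alpha) + (1 - alpha) * math.log2(1 - alpha))
i_xy = (1 - p) * h_x   # for a binary erasure channel, I(x;y) = (1 - p) H(x)
print(round(i_xy, 2))  # 0.8
```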

A binary channel matrix is given by

Given, \({\rm{P}}\left( {{{\rm{x}}_1}} \right) = 1/3\) and \({\rm{P}}\left( {{{\rm{x}}_2}} \right) = 2/3\). The value of \({\rm{H}}\left( {\rm{Y}} \right)\) is ________bit/symbol.

The channel matrix can be represented as

Now \({\rm{P}}\left( {{{\rm{y}}_1}} \right) = {\rm{P}}\left( {\frac{{{{\rm{y}}_1}}}{{{{\rm{x}}_1}}}} \right){\rm{P}}\left( {{{\rm{x}}_1}} \right) + {\rm{P}}\left( {\frac{{{{\rm{y}}_1}}}{{{{\rm{x}}_2}}}} \right){\rm{P}}\left( {{{\rm{x}}_2}} \right)\)

\(\Rightarrow {\rm{P}}\left( {{{\rm{y}}_1}} \right) = \frac{2}{3} \cdot \frac{1}{3} + \frac{1}{{10}} \cdot \frac{2}{3} = \frac{{13}}{{45}}\)

And

\(\begin{array}{l} {\rm{P}}\left( {{{\rm{y}}_2}} \right) = {\rm{P}}\left( {\frac{{{{\rm{y}}_2}}}{{{{\rm{x}}_1}}}} \right){\rm{P}}\left( {{{\rm{x}}_1}} \right) + {\rm{P}}\left( {\frac{{{{\rm{y}}_2}}}{{{{\rm{x}}_2}}}} \right){\rm{P}}\left( {{{\rm{x}}_2}} \right)\\ \Rightarrow {\rm{P}}\left( {{{\rm{y}}_2}} \right) = \frac{1}{3} \cdot \frac{1}{3} + \frac{9}{{10}} \cdot \frac{2}{3} = \frac{{32}}{{45}} \end{array}\)

Now

\(\begin{array}{l} {\rm{H}}\left( {\rm{Y}} \right) = {\rm{P}}\left( {{{\rm{y}}_1}} \right){\log _2}\left( {\frac{1}{{{\rm{P}}\left( {{{\rm{y}}_1}} \right)}}} \right) + {\rm{P}}\left( {{{\rm{y}}_2}} \right){\log _2}\left( {\frac{1}{{{\rm{P}}\left( {{{\rm{y}}_2}} \right)}}} \right)\\ \Rightarrow {\rm{H}}\left( {\rm{Y}} \right) = \frac{{13}}{{45}}{\log _2}\left( {\frac{{45}}{{13}}} \right) + \frac{{32}}{{45}}{\log _2}\left( {\frac{{45}}{{32}}} \right)\\ \Rightarrow {\rm{H}}\left( {\rm{Y}} \right) = 0.8673\frac{{{\rm{bits}}}}{{{\rm{symbol}}}} \end{array}\)
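The steps above can be sketched as follows. The channel matrix is not reproduced here, so the transition probabilities are taken from the terms of the P(y₁) and P(y₂) computations (2/3, 1/3 from x₁ and 1/10, 9/10 from x₂):

```python
import math

px = [1/3, 2/3]                  # P(x1), P(x2)
channel = [[2/3, 1/3],           # P(y_j | x1), implied by the solution
           [1/10, 9/10]]         # P(y_j | x2)

# P(y_j) = sum_i P(y_j | x_i) * P(x_i)  -> approximately [13/45, 32/45]
py = [sum(channel[i][j] * px[i] for i in range(2)) for j in range(2)]

# Output entropy H(Y) in bits/symbol:
h_y = sum(p * math.log2(1 / p) for p in py)
print(round(h_y, 4))  # 0.8673
```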