Mutual Information

Mutual information is a measure of how much information one random variable provides about another. It is a dimensionless quantity, expressed in bits per symbol when base-2 logarithms are used, and it reflects the reduction in uncertainty about one variable given knowledge of the other.

Efficient communication systems aim to maximize the mutual information between the channel input and output.


Joint Entropies

Using:

- Input probabilities \(P(x_i)\)
- Output probabilities \(P(y_j)\)
- Transition probabilities \(P(y_j|x_i)\)
- Joint probabilities \(P(x_i, y_j)\)
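
These quantities are related by \(P(x_i, y_j) = P(x_i)\,P(y_j|x_i)\) and \(P(y_j) = \sum_i P(x_i)\,P(y_j|x_i)\).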

We define:
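
the marginal entropies

\[
H(X) = -\sum_i P(x_i)\log_2 P(x_i), \qquad H(Y) = -\sum_j P(y_j)\log_2 P(y_j)
\]

and the joint entropy

\[
H(X, Y) = -\sum_i \sum_j P(x_i, y_j)\log_2 P(x_i, y_j)
\]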


Interpretation of Joint Entropy

\(H(X, Y)\) measures the average uncertainty of the communication system as a whole: the combined uncertainty, per symbol pair, about which symbol was transmitted and which was received.


Conditional Entropy

Formulas:
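
\[
H(Y|X) = -\sum_i \sum_j P(x_i, y_j)\log_2 P(y_j|x_i)
\]

\[
H(X|Y) = -\sum_i \sum_j P(x_i, y_j)\log_2 P(x_i|y_j)
\]

\(H(Y|X)\) is the average uncertainty about the received symbol when the transmitted symbol is known, and \(H(X|Y)\) (the equivocation) is the average uncertainty about the transmitted symbol after the output has been observed.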


Useful Identities
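
The joint entropy decomposes according to the chain rule:

\[
H(X, Y) = H(X) + H(Y|X) = H(Y) + H(X|Y)
\]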

These identities mirror relationships found in general probability theory.


Mutual Information

Mutual information \(I(X; Y)\) quantifies the reduction in the entropy of \(X\) that results from knowing \(Y\). It can be expressed in several equivalent ways:
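
\[
I(X; Y) = H(X) - H(X|Y) = H(Y) - H(Y|X) = H(X) + H(Y) - H(X, Y)
\]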

All expressions are measured in bits per symbol.
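
As a quick numerical check, the sketch below computes these quantities directly from a joint probability table. It uses Python with NumPy, and the channel in the example (a binary symmetric channel with crossover probability 0.1 and equiprobable inputs) is a hypothetical choice made purely for illustration.

```python
import numpy as np

def information_measures(p_xy):
    """Return H(X), H(Y), H(X,Y) and I(X;Y) in bits from a joint probability table.

    p_xy[i, j] = P(x_i, y_j); rows index channel inputs, columns index outputs.
    """
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1)   # marginal P(x_i)
    p_y = p_xy.sum(axis=0)   # marginal P(y_j)

    def H(p):
        p = p[p > 0]         # drop zero entries, since 0 * log2(0) is taken as 0
        return -np.sum(p * np.log2(p))

    h_x, h_y, h_xy = H(p_x), H(p_y), H(p_xy.ravel())
    i_xy = h_x + h_y - h_xy  # I(X;Y) = H(X) + H(Y) - H(X,Y)
    return h_x, h_y, h_xy, i_xy

# Hypothetical joint distribution: binary symmetric channel, crossover 0.1,
# equiprobable inputs, so P(x_i, y_j) = P(x_i) * P(y_j|x_i).
p_joint = [[0.45, 0.05],
           [0.05, 0.45]]

h_x, h_y, h_xy, i_xy = information_measures(p_joint)
print(f"H(X)   = {h_x:.4f} bits/symbol")
print(f"H(Y)   = {h_y:.4f} bits/symbol")
print(f"H(X,Y) = {h_xy:.4f} bits/symbol")
print(f"I(X;Y) = {i_xy:.4f} bits/symbol")
```

For this hypothetical channel the result is \(I(X; Y) \approx 0.53\) bits per symbol, below the 1 bit per symbol carried by the noiseless input; the difference is exactly the equivocation \(H(X|Y)\) introduced by the channel noise.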

