Entropy of random events

Entropy is a measure of the information content of a random variable, and it is usually expressed in bits. Events that are likely to happen carry fewer bits of information than events that occur rarely.
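To make the likely-versus-rare contrast concrete, here is a minimal Python sketch (the function name information_bits is ours, chosen for illustration) computing the information content −log2(p) of a single event:

```python
import math

def information_bits(p: float) -> float:
    # Information content (surprisal) of an event with probability p
    return -math.log2(p)

print(information_bits(0.5))  # 1.0 bit   (likely event, little information)
print(information_bits(0.2))  # ~2.32 bits (rare event, more information)
```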

We chose the following events:

A: It’s raining

B: A plane is landing

C: Someone is wearing a white hat

The probability assigned to each event:

P(A) = 0.5

P(B) = 0.4

P(C) = 0.2

H(X) = −[P(A)⋅log2(P(A)) + P(B)⋅log2(P(B)) + P(C)⋅log2(P(C))]

H(X) = −[(0.5⋅log2(0.5)) + (0.4⋅log2(0.4)) + (0.2⋅log2(0.2))]

H(X) = −[(0.5⋅(−1)) + (0.4⋅(−1.322)) + (0.2⋅(−2.322))]

H(X) = −[(−0.5) + (−0.5288) + (−0.4644)]

H(X) = −(−1.4932)

H(X) ≈ 1.4932

The entropy of these events is approximately 1.4932 bits.
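As a check, the same calculation can be scripted. A minimal Python sketch (the variable names are ours), applying the entropy formula above to the three listed probabilities:

```python
import math

# Probabilities of the three events listed above
p_events = {"A: it's raining": 0.5,
            "B: a plane is landing": 0.4,
            "C: someone wears a white hat": 0.2}

# H(X) = -sum(p * log2(p)) over the events
entropy = -sum(p * math.log2(p) for p in p_events.values())
print(f"H(X) ≈ {entropy:.4f} bits")  # prints: H(X) ≈ 1.4932 bits
```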