Data is any content of any data carrier. (“aAm58’*”)
Message is data that respects defined syntactic rules (JPEG picture coding, the Hungarian language, …). (“It is cold”)
Information is a message that lowers the behavioral entropy of the addressed subject. (“It is cold outside”)
Knowledge is contextual information that contains rules or recommended steps for achieving the needed results. (“If it is cold outside, then taking a cap is recommended.”)
Let \(\mathcal A\) be a set of random events. A random event is an event that does not happen with certainty, but whose rules of occurrence are known.
Probability is a function \(P: \mathcal A \rightarrow [0,1]\), where, for a random event \(A\), the value \(P(A)\) expresses how likely \(A\) is to occur: \(P(A) = 0\) for an impossible event and \(P(A) = 1\) for a certain event.
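As a minimal sketch of this definition in Python (assuming the events are mutually exclusive and jointly exhaustive; the name `is_distribution` and the tolerance `tol` are illustrative choices, not from the text), one can check that a list of values is a valid assignment of probabilities:

```python
def is_distribution(probs, tol=1e-9):
    """Check that the values form a valid probability assignment:
    each value lies in [0, 1], and the values over all mutually
    exclusive, jointly exhaustive events sum to 1.
    """
    return (all(0.0 <= p <= 1.0 for p in probs)
            and abs(sum(probs) - 1.0) <= tol)
```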
Entropy \(H\) of the set \(\mathcal A\) is measured in bits and is defined by the following formula:
\[H = - \sum_{A \in \mathcal A} P(A) \log_2(P(A))\]
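A minimal computational sketch of this formula in Python (the helper name `entropy` is an illustrative choice; terms with \(P(A) = 0\) are skipped, which matches the limit convention used in the double-tailed coin example below):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability distribution.

    Terms with p = 0 are skipped, following the convention
    0 * log2(0) = 0 (the limit as p -> 0+).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)
```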
If a coin is well balanced (fair), then when tossing it the probability of heads is \(P(H)=0.5\) and of tails is \(P(T)=0.5\).
Then the entropy of tossing the coin is \[H = -0.5 \log_2(0.5) - 0.5 \log_2(0.5) = -0.5 \cdot (-1) - 0.5 \cdot (-1) = 1.\] The system is maximally unpredictable: for two equiprobable events, the entropy reaches its maximum of 1 bit. Now suppose the coin has tails on both sides, so a tail falls with certainty (\(P(H)=0\), \(P(T)=1\)). Then the entropy is
\[H = -1 \cdot \log_2(1) - \lim_{P(H) \rightarrow 0^+} P(H) \log_2(P(H)) = 0 - 0 = 0,\] so the entropy drops to its minimum. The higher the entropy, the more unpredictable the system of random events.
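Using the `entropy` sketch from above, both coin cases can be verified numerically (the biased coin in the last line is an added illustration, not from the text):

```python
print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit, maximal for two events
print(entropy([1.0, 0.0]))  # double-tailed coin: 0.0 bits, fully predictable
print(entropy([0.9, 0.1]))  # biased coin: ~0.47 bits, between the extremes
```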