Lemma
math, backwards

entropy

ko · counterpart 엔트로피

`H(X) = −Σ pᵢ log₂ pᵢ`. The average number of yes/no questions needed to identify which outcome occurred. Reaches its maximum, `log₂ N`, when all `N` outcomes are equally likely; collapses to 0 when one outcome has probability 1 (no uncertainty). The base of the log picks the unit: log₂ → bits, ln → nats, log₁₀ → bans. Built on `log` so that entropy adds over independent variables: `H(X, Y) = H(X) + H(Y)` when X and Y are independent.
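A minimal sketch of the formula in Python; the function name `entropy` and the example distributions are illustrative, not from the original entry:

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H = -sum(p * log(p)); base 2 gives bits, e gives nats."""
    # Terms with p == 0 contribute nothing (p log p -> 0), so skip them.
    return sum(-p * math.log(p, base) for p in probs if p > 0)

# Fair coin: maximum entropy for N = 2 outcomes -> log2(2) = 1 bit.
print(entropy([0.5, 0.5]))   # 1.0
# One certain outcome: no uncertainty -> 0 bits.
print(entropy([1.0]))        # 0.0
# Independence: H(X, Y) = H(X) + H(Y).
x, y = [0.5, 0.5], [0.25, 0.75]
joint = [px * py for px in x for py in y]
print(entropy(joint), entropy(x) + entropy(y))  # equal up to float rounding
```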

invented

1948 · Claude Shannon · Bell Labs, New Jersey

Shannon's *A Mathematical Theory of Communication* (1948) defined information itself, in bits, as the average number of yes/no questions needed to identify an outcome. Cryptanalysis at Bell Labs needed a quantitative answer to 'how much information is in this signal?'; the answer ended up founding the entire field of information theory.

en.wikipedia.org/wiki/A_Mathematical_Theory_of_Communication ↗

related
used on · 1