Shannon information entropy

Last updated Dec 29, 2024

Definition: Shannon information entropy

The Shannon information entropy $H$ of a probability distribution $p(x)$ is defined as $$H = -\sum_x p(x)\log p(x) = E[-\log p(x)]$$ It can be interpreted as the “uncertainty” in a probability distribution: the expected surprisal $-\log p(x)$ of an outcome drawn from it. For example, a fair coin has $H = \log 2$ (one bit, using base-2 logarithms), while a distribution that puts all its mass on a single outcome has $H = 0$.
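
A minimal sketch of computing this numerically (the function name `shannon_entropy` and the base-2 convention are illustrative choices, not part of the definition above):

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H = -sum_x p(x) * log p(x) of a discrete distribution.

    Terms with p == 0 contribute nothing, following the standard
    convention 0 * log 0 = 0 (the limit as p -> 0).
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries 1 bit of uncertainty ...
print(shannon_entropy([0.5, 0.5]))   # 1.0
# ... a biased coin carries less ...
print(shannon_entropy([0.9, 0.1]))   # ~0.469
# ... and a certain outcome carries none.
print(shannon_entropy([1.0]))        # 0.0
```

Choosing `base=2` measures $H$ in bits; passing `base=math.e` would give nats instead.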