Shannon information entropy
Definition: Shannon information entropy
The Shannon information entropy $H$ of a probability distribution $p(x)$ is defined as $$H = -\sum_x p(x)\log p(x) = E[-\log p(X)],$$ where the sum runs over outcomes with $p(x) > 0$. It can be interpreted as the “uncertainty” in the distribution: $H$ is zero when one outcome is certain and maximal when the distribution is uniform. The base of the logarithm fixes the units (base 2 gives bits; the natural log gives nats).
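As a quick illustrative sketch (the function name `shannon_entropy` is my own, not from the source), the definition translates directly into code; terms with $p(x) = 0$ are skipped, following the convention $0 \log 0 = 0$:

```python
import math

def shannon_entropy(probs, base=2):
    """H = -sum p(x) * log p(x), skipping zero-probability outcomes."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin is maximally uncertain over two outcomes: H = 1 bit.
print(shannon_entropy([0.5, 0.5]))              # → 1.0
# A uniform distribution over four outcomes: H = 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0
```

A certain outcome, e.g. `shannon_entropy([1.0])`, gives zero entropy, matching the interpretation of $H$ as uncertainty.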