komm.entropy
Computes the entropy of a random variable with a given pmf. Let $X$ be a random variable with pmf $p_X$ and alphabet $\mathcal{X}$. Its entropy is given by
$$
\mathrm{H}(X) = \sum_{x \in \mathcal{X}} p_X(x) \log \frac{1}{p_X(x)}.
$$
By default, the base of the logarithm is $2$, in which case the entropy is measured in bits. For more details, see CT06, Ch. 2.
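As an illustration of this definition, a minimal NumPy sketch might look like the following. This is not komm's implementation; zero-probability outcomes are dropped under the standard convention $0 \log(1/0) = 0$.

```python
import numpy as np

def entropy_bits(pmf):
    # Sketch of the formula above: H(X) = sum_x p(x) * log2(1 / p(x)).
    # Zero-probability outcomes are dropped (convention: 0 * log(1/0) = 0).
    p = np.asarray(pmf, dtype=float)
    p = p[p > 0]
    return float(np.sum(p * np.log2(1.0 / p)))

print(entropy_bits([1/4, 1/4, 1/4, 1/4]))  # 2.0
```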
Parameters:
- pmf (Array1D[float]) – The probability mass function $p_X$ of the random variable. It must be a valid pmf, that is, all of its values must be non-negative and sum up to $1$ (see the sketch after this list).
- base (Optional[float | str]) – The base of the logarithm to be used. It must be a positive float or the string 'e'. The default value is 2.0.
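For concreteness, such a validity check might look like the sketch below; the helper name validate_pmf is hypothetical and not part of komm's API.

```python
import numpy as np

def validate_pmf(pmf, tol=1e-9):
    # Hypothetical helper: checks non-negativity and normalization.
    p = np.asarray(pmf, dtype=float)
    if np.any(p < 0):
        raise ValueError("pmf has negative entries")
    if not np.isclose(p.sum(), 1.0, atol=tol):
        raise ValueError("pmf does not sum to 1")
    return p
```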
Returns:
- entropy (float) – The entropy $\mathrm{H}(X)$ of the random variable.
Examples:
>>> komm.entropy([1/4, 1/4, 1/4, 1/4])
np.float64(2.0)
>>> komm.entropy(pmf=[1/3, 1/3, 1/3], base=3.0)
np.float64(1.0)
>>> komm.entropy([0.5, 0.5], base='e')
np.float64(0.6931471805599453)
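Note that changing the base only rescales the entropy, since $\mathrm{H}_b(X) = \mathrm{H}_2(X) / \log_2 b$. For example, the natural-log value above equals the value in bits divided by $\log_2 \mathrm{e}$:
>>> import numpy as np
>>> h_bits = komm.entropy([0.5, 0.5])
>>> h_nats = komm.entropy([0.5, 0.5], base='e')
>>> bool(np.isclose(h_nats, h_bits / np.log2(np.e)))
True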