
komm.relative_entropy

Computes the relative entropy (Kullback–Leibler divergence) between two pmfs.

Let $p$ and $q$ be two pmfs over the same alphabet $\mathcal{X}$. The relative entropy of $p$ with respect to $q$ is defined as
$$
\mathrm{D}(p || q) = \sum_{x \in \mathcal{X}} p(x) \log \frac{p(x)}{q(x)}.
$$
Note that, in general, $\mathrm{D}(p || q) \neq \mathrm{D}(q || p)$. For more details, see CT06, Sec. 2.3.
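
As a concrete reading of this definition, the following minimal sketch (using NumPy; the function name and structure are illustrative and not komm's implementation) evaluates $\mathrm{D}(p || q)$ term by term, with the usual conventions that terms with $p(x) = 0$ contribute zero and that $q(x) = 0$ with $p(x) > 0$ yields $\infty$:

import numpy as np

def relative_entropy_sketch(pmf, qmf, base=2.0):
    # Illustrative sketch only, not the komm implementation.
    p = np.asarray(pmf, dtype=float)
    q = np.asarray(qmf, dtype=float)
    log = np.log if base == "e" else (lambda x: np.log(x) / np.log(base))
    total = 0.0
    for pi, qi in zip(p, q):
        if pi == 0.0:
            continue  # convention: 0 * log(0 / q) = 0
        if qi == 0.0:
            return float("inf")  # p(x) > 0 while q(x) = 0
        total += pi * log(pi / qi)
    return total

relative_entropy_sketch([1/2, 1/2], [3/4, 1/4])  # ≈ 0.2075, matching the examples below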

Parameters:

  • pmf (ArrayLike)

    The probability mass function $p$. It must be a valid pmf, that is, all of its values must be non-negative and sum up to $1$.

  • qmf (ArrayLike)

    The probability mass function $q$. It must be a valid pmf, that is, all of its values must be non-negative and sum up to $1$.

  • base (LogBase)

The base of the logarithm to be used. It must be a positive float or the string 'e'. The default value is 2.0. Changing the base only rescales the result, as shown in the sketch after this list.
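
Since $\log_b x = \ln x / \ln b$, a relative entropy of $d$ nats (base 'e') equals $d / \ln 2$ bits (base 2.0). A hedged sketch of this rescaling, using plain NumPy rather than komm-specific code:

import numpy as np

# The base only rescales the result, since log_b(x) = ln(x) / ln(b).
p = np.array([3/4, 1/4])
q = np.array([1/2, 1/2])
d_nats = np.sum(p * np.log(p / q))  # relative entropy with base 'e' (nats)
d_bits = d_nats / np.log(2)         # same divergence with base 2.0 (bits), ≈ 0.1887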

Returns:

  • float

    The relative entropy $\mathrm{D}(p || q)$ between the two pmfs.

Examples:

>>> komm.relative_entropy([1/2, 1/2], [1/2, 1/2])
np.float64(0.0)
>>> komm.relative_entropy([1/2, 1/2], [3/4, 1/4])
np.float64(0.20751874963942185)
>>> komm.relative_entropy([3/4, 1/4], [1/2, 1/2])
np.float64(0.18872187554086717)
>>> komm.relative_entropy([1/2, 1/2], [0, 1])
np.float64(inf)