Consider the set M_+^1(A) of probability distributions, where A is a set provided with some σ-algebra of measurable subsets. In particular, we can take A to be a finite or
countable set with all subsets being measurable. The Jensen–Shannon divergence (JSD) is a symmetrized and smoothed version of the Kullback–Leibler divergence D(P \parallel Q). It is defined by

:{\rm JSD}(P \parallel Q) = \frac{1}{2} D(P \parallel M) + \frac{1}{2} D(Q \parallel M),

where M = \frac{1}{2}(P + Q) is a mixture distribution of P and Q.
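A minimal Python sketch of this definition for discrete distributions (the helper names kl_divergence and jsd are our own, and base-2 logarithms are assumed, so values are in bits):

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) in bits, for discrete
    distributions given as probability vectors on the same support.
    Terms with p_i = 0 contribute 0 by convention."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def jsd(p, q):
    """JSD(P || Q) = (1/2) D(P || M) + (1/2) D(Q || M),
    where M = (P + Q) / 2 is the equal-weight mixture."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

p = [0.5, 0.5, 0.0]
q = [0.0, 0.1, 0.9]
print(jsd(p, q))  # finite and symmetric even though D(P || Q) is infinite here
```

Because M assigns positive probability to every outcome that P or Q does, both KL terms are finite, so the JSD is defined even when D(P \parallel Q) is not. (SciPy's scipy.spatial.distance.jensenshannon returns the square root of this quantity, the Jensen–Shannon distance.)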
The geometric Jensen–Shannon divergence (or G-Jensen–Shannon divergence) yields a closed-form formula for the divergence between two Gaussian distributions by taking the geometric mean.

A more general definition, allowing for the comparison of more than two probability distributions, is

: \begin{align}
{\rm JSD}_{\pi_1, \ldots, \pi_n}(P_1, P_2, \ldots, P_n) &= \sum_{i=1}^n \pi_i D(P_i \parallel M) \\
&= H(M) - \sum_{i=1}^n \pi_i H(P_i),
\end{align}

where

: M := \sum_{i=1}^n \pi_i P_i

and \pi_1, \ldots, \pi_n are weights selected for the probability distributions P_1, P_2, \ldots, P_n, and H(P) is the Shannon entropy for distribution P. For the two-distribution case described above, P_1 = P, P_2 = Q, and \pi_1 = \pi_2 = \frac{1}{2}; hence, for those distributions P and Q,

: {\rm JSD}(P \parallel Q) = H(M) - \frac{1}{2}\big(H(P) + H(Q)\big).
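A short Python sketch of this entropy form (again with our own hypothetical helper names and base-2 logarithms): the generalized divergence is H(M) minus the weighted average of the individual entropies, and with n = 2 and equal weights it reproduces jsd(p, q) from the sketch above.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(P) in bits; zero-probability entries contribute 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def generalized_jsd(dists, weights):
    """JSD_{pi_1,...,pi_n}(P_1,...,P_n) = H(M) - sum_i pi_i H(P_i),
    where M = sum_i pi_i P_i is the weighted mixture."""
    dists = [np.asarray(p, dtype=float) for p in dists]
    weights = np.asarray(weights, dtype=float)
    m = sum(w * p for w, p in zip(weights, dists))  # mixture M
    return shannon_entropy(m) - sum(
        w * shannon_entropy(p) for w, p in zip(weights, dists))

p = [0.5, 0.5, 0.0]
q = [0.0, 0.1, 0.9]
print(generalized_jsd([p, q], [0.5, 0.5]))  # equals jsd(p, q) above
```

==Bounds==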