The discriminability index is the separation between the means of two distributions (typically the signal and the noise distributions), in units of the
standard deviation.
==Equal variances/covariances==
For two univariate distributions a and b with the same standard deviation \sigma, the discriminability index is denoted by d' ('dee-prime'):

: d' = \frac{\left\vert \mu_a - \mu_b \right\vert}{\sigma}.

In higher dimensions, i.e. for two multivariate distributions with the same variance-covariance matrix \mathbf{\Sigma} (whose symmetric square root, the standard deviation matrix, is \mathbf{S}), this generalizes to the Mahalanobis distance between the two distributions:

: d'=\sqrt{(\boldsymbol{\mu}_a-\boldsymbol{\mu}_b)'\mathbf{\Sigma}^{-1}(\boldsymbol{\mu}_a-\boldsymbol{\mu}_b)} = \lVert \mathbf{S}^{-1}(\boldsymbol{\mu}_a-\boldsymbol{\mu}_b) \rVert = \lVert \boldsymbol{\mu}_a-\boldsymbol{\mu}_b \rVert /\sigma_{\boldsymbol{\mu}},

where \boldsymbol{\mu} is the unit vector along \boldsymbol{\mu}_b-\boldsymbol{\mu}_a, and \sigma_{\boldsymbol{\mu}} = 1/ \lVert\mathbf{S}^{-1}\boldsymbol{\mu}\rVert is the standard deviation of the 1d slice of the distribution along that direction, i.e. the multivariate d' equals the d' of the 1d slice through the means.

For two bivariate distributions with equal variance-covariance, this is given by:

: {d'}^2 =\frac{1}{1-\rho^2} \left({d'}^2_x+{d'}^2_y-2\rho {d'}_x {d'}_y \right),

where \rho is the correlation coefficient, and here d'_x=\frac{{\mu_b}_x-{\mu_a}_x}{\sigma_x} and d'_y=\frac{{\mu_b}_y-{\mu_a}_y}{\sigma_y}, i.e. the signed mean differences are used rather than their absolute values. d' is also estimated as Z(\text{hit rate})-Z(\text{false alarm rate}).
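These definitions can be sketched numerically as follows, assuming NumPy and SciPy are available; the function names are illustrative, not part of any standard API:

```python
import numpy as np
from scipy.stats import norm

def dprime_from_rates(hit_rate, false_alarm_rate):
    """Estimate d' from a yes/no task as Z(hit rate) - Z(false alarm rate),
    where Z is the inverse cdf (ppf) of the standard normal."""
    return norm.ppf(hit_rate) - norm.ppf(false_alarm_rate)

def dprime_mahalanobis(mu_a, mu_b, Sigma):
    """d' between two distributions sharing the covariance matrix Sigma:
    the Mahalanobis distance between the two mean vectors."""
    diff = np.asarray(mu_a, float) - np.asarray(mu_b, float)
    # diff' Sigma^{-1} diff, via a linear solve rather than an explicit inverse
    return float(np.sqrt(diff @ np.linalg.solve(Sigma, diff)))
```

For bivariate inputs, `dprime_mahalanobis` agrees with the closed-form expression in \rho, d'_x, d'_y above.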
==Unequal variances/covariances==
When the two distributions have different standard deviations (or, in general dimensions, different covariance matrices), there exist several contending indices, all of which reduce to d' for equal variance/covariance.
===Bayes discriminability index===
This is the maximum (Bayes-optimal) discriminability index for two distributions, based on the amount of their overlap, i.e. the optimal (Bayes) error of classification e_b by an ideal observer, or its complement, the optimal accuracy a_b:

: d'_b=-2Z\left(\text{Bayes error rate } e_b\right)=2Z\left(\text{best accuracy rate } a_b\right),

where Z is the inverse cumulative distribution function of the standard normal. The Bayes discriminability between univariate or multivariate normal distributions can be numerically computed (Matlab code), and may also be used as an approximation when the distributions are close to normal.

d'_b is a positive-definite statistical distance measure that, like the Kullback–Leibler divergence D_\text{KL}, is free of assumptions about the distributions. D_\text{KL}(a,b) is asymmetric, whereas d'_b(a,b) is symmetric for the two distributions. However, d'_b does not satisfy the
triangle inequality, so it is not a full metric.

In particular, for a yes/no task between two univariate normal distributions with means \mu_a,\mu_b and variances v_a>v_b, the Bayes-optimal classification accuracies are:

: p(A|a)=p\left({\chi'}^2_{1,v_a \lambda} > v_b c\right), \; \; p(B|b)=p\left({\chi'}^2_{1,v_b \lambda} < v_a c\right),

where \chi'^2 denotes the non-central chi-squared distribution, \lambda=\left(\frac{\mu_a-\mu_b}{v_a-v_b}\right)^2, and c=\lambda+\frac{\ln v_a -\ln v_b}{v_a-v_b}. The Bayes discriminability is then d'_b=2Z\left(\frac{p\left(A|a\right)+p\left(B|b\right)}{2} \right). d'_b can also be computed from the
ROC curve of a yes/no task between two univariate normal distributions with a single shifting criterion. It can also be computed from the ROC curve of any two distributions (in any number of variables) with a shifting likelihood-ratio criterion, by locating the point on the ROC curve that is farthest from the diagonal.

For a two-interval task between these distributions, the optimal accuracy is a_b=p \left( \tilde{\chi}^2_{\boldsymbol{w}, \boldsymbol{k}, \boldsymbol{\lambda},0,0}>0 \right) (\tilde{\chi}^2 denotes the generalized chi-squared distribution), where

: \boldsymbol{w}=\begin{bmatrix} \sigma_s^2 & -\sigma_n^2 \end{bmatrix}, \; \boldsymbol{k}=\begin{bmatrix} 1 & 1 \end{bmatrix}, \; \boldsymbol{\lambda}=\left(\frac{\mu_s-\mu_n}{\sigma_s^2-\sigma_n^2}\right)^2 \begin{bmatrix} \sigma_s^2 & \sigma_n^2 \end{bmatrix}.

The Bayes discriminability for this task is d'_b=\sqrt{2}\,Z\left(a_b\right).
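The yes/no expressions above can be evaluated with SciPy's non-central chi-squared distribution; a minimal sketch, with an illustrative (non-standard) function name:

```python
import numpy as np
from scipy.stats import ncx2, norm

def bayes_dprime_1d(mu_a, va, mu_b, vb):
    """Bayes discriminability d'_b between N(mu_a, va) and N(mu_b, vb)
    with unequal variances, via the non-central chi-squared accuracies."""
    if va < vb:  # the formulas assume va > vb; swap labels if needed
        mu_a, va, mu_b, vb = mu_b, vb, mu_a, va
    lam = ((mu_a - mu_b) / (va - vb)) ** 2               # lambda
    c = lam + (np.log(va) - np.log(vb)) / (va - vb)      # criterion constant
    p_a_given_a = ncx2.sf(vb * c, df=1, nc=va * lam)     # p(A|a)
    p_b_given_b = ncx2.cdf(va * c, df=1, nc=vb * lam)    # p(B|b)
    return 2 * norm.ppf((p_a_given_a + p_b_given_b) / 2)
```

As a sanity check, the result matches d'_b = 2Z(1 - e_b) with the Bayes error e_b obtained by numerically integrating the overlap of the two equal-prior densities.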
===RMS sd discriminability index===
A common approximate (i.e. sub-optimal) discriminability index with a closed form is obtained by averaging the variances, i.e. using the rms of the two standard deviations, \sigma_\text{rms}=\sqrt{\left(\sigma_a^2+\sigma_b^2\right)/2}:

: d'_a=\left\vert \mu_a -\mu_b \right\vert/\sigma_\text{rms}

(also denoted by d_a). It is \sqrt{2} times the z-score of the area under the receiver operating characteristic curve (AUC) of a single-criterion observer. This index extends to general dimensions as the Mahalanobis distance using the pooled covariance, i.e. with \mathbf{S}_\text{rms}=\left[\left(\mathbf{\Sigma}_a+\mathbf{\Sigma}_b\right)/2 \right]^\frac{1}{2} as the common sd matrix.
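A univariate sketch of this index and its AUC relation, assuming NumPy and SciPy (the single-criterion AUC for two normals is \Phi\left(|\mu_a-\mu_b|/\sqrt{\sigma_a^2+\sigma_b^2}\right); function names are illustrative):

```python
import numpy as np
from scipy.stats import norm

def dprime_rms(mu_a, sigma_a, mu_b, sigma_b):
    """Approximate index d'_a: mean separation over the rms of the two sds."""
    sigma_rms = np.sqrt((sigma_a**2 + sigma_b**2) / 2)
    return abs(mu_a - mu_b) / sigma_rms

def auc_single_criterion(mu_a, sigma_a, mu_b, sigma_b):
    """AUC of a single-criterion observer for two univariate normals."""
    return norm.cdf(abs(mu_a - mu_b) / np.sqrt(sigma_a**2 + sigma_b**2))
```

For equal standard deviations, d'_a reduces to d', and in general d'_a = \sqrt{2}\,Z(\text{AUC}).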
===Average sd discriminability index===
Another index is d'_e=\left\vert \mu_a -\mu_b \right\vert/\sigma_\text{avg}, with \sigma_\text{avg}=\left(\sigma_a+\sigma_b\right)/2, extended to general dimensions using \mathbf{S}_\text{avg}=\left(\mathbf{S}_a+\mathbf{S}_b\right)/2 as the common sd matrix.

==Contribution to discriminability by each dimension==