Dual total correlation is non-negative and bounded above by the joint entropy H(X_1, \ldots, X_n):
: 0 \leq D(X_1, \ldots, X_n) \leq H(X_1, \ldots, X_n) .
Secondly, dual total correlation has a close relationship with total correlation, C(X_1, \ldots, X_n), and can be written as a difference between the total correlation of the whole and the total correlations of all subsets of size n-1:
: D(\textbf{X}) = (n-1)C(\textbf{X}) - \sum_{i=1}^{n} C(\textbf{X}^{-i})
where \textbf{X} = \{X_1,\ldots, X_n\} and \textbf{X}^{-i} = \{X_1,\ldots, X_{i-1}, X_{i+1},\ldots, X_n\}.
Furthermore, the total correlation and dual total correlation are related by the following bounds:
: \frac{C(X_1, \ldots, X_n)}{n-1} \leq D(X_1, \ldots, X_n) \leq (n-1) \; C(X_1, \ldots, X_n) .
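These quantities can be computed directly from a discrete joint distribution. The sketch below is illustrative rather than canonical: it assumes the probabilities are stored as a NumPy array p[x_1, \ldots, x_n], uses the standard entropy forms C(\textbf{X}) = \sum_i H(X_i) - H(\textbf{X}) and D(\textbf{X}) = \sum_i H(\textbf{X}^{-i}) - (n-1)H(\textbf{X}), and the helper names are arbitrary. It checks the identity and bounds above on a three-bit XOR example.

<syntaxhighlight lang="python">
import numpy as np

def marginal_entropy(p, keep):
    """Shannon entropy (in bits) of the marginal of p over the axes in `keep`."""
    drop = tuple(ax for ax in range(p.ndim) if ax not in keep)
    q = p.sum(axis=drop) if drop else p
    q = q[q > 0]
    return float(-np.sum(q * np.log2(q)))

def total_correlation(p):
    """C(X) = sum_i H(X_i) - H(X_1, ..., X_n)."""
    n = p.ndim
    joint = marginal_entropy(p, tuple(range(n)))
    return sum(marginal_entropy(p, (i,)) for i in range(n)) - joint

def dual_total_correlation(p):
    """D(X) = sum_i H(X^{-i}) - (n - 1) H(X_1, ..., X_n)."""
    n = p.ndim
    joint = marginal_entropy(p, tuple(range(n)))
    leave_one_out = [tuple(j for j in range(n) if j != i) for i in range(n)]
    return sum(marginal_entropy(p, s) for s in leave_one_out) - (n - 1) * joint

# Example: X3 = X1 XOR X2, with X1 and X2 independent fair bits.
p = np.zeros((2, 2, 2))
for x1 in (0, 1):
    for x2 in (0, 1):
        p[x1, x2, x1 ^ x2] = 0.25

n = p.ndim
C = total_correlation(p)                  # 1 bit
D = dual_total_correlation(p)             # 2 bits
H = marginal_entropy(p, tuple(range(n)))  # 2 bits

# Identity D(X) = (n-1) C(X) - sum_i C(X^{-i}): each pair of variables is
# independent here, so every C(X^{-i}) vanishes and D = (n-1) C = 2 bits.
pair_C = [total_correlation(p.sum(axis=i)) for i in range(n)]
assert np.isclose(D, (n - 1) * C - sum(pair_C))

# Bounds: 0 <= D <= H(X)  and  C/(n-1) <= D <= (n-1) C.
assert 0 <= D <= H + 1e-12
assert C / (n - 1) - 1e-12 <= D <= (n - 1) * C + 1e-12
</syntaxhighlight>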
Finally, the difference between the total correlation and the dual total correlation defines a measure of higher-order information-sharing, the O-information:
: \Omega(\textbf{X}) = C(\textbf{X}) - D(\textbf{X}) .
The O-information (first introduced as the "enigmatic information" by James and Crutchfield) is a signed measure that quantifies the extent to which the information in a multivariate random variable is dominated by synergistic interactions (in which case \Omega(\textbf{X}) < 0) or by redundant interactions (in which case \Omega(\textbf{X}) > 0). It has found multiple applications in neuroscience.
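For example, if X_1 and X_2 are independent fair bits and X_3 = X_1 \oplus X_2 (exclusive OR), every single-variable entropy is 1 bit while the joint and every two-variable entropy is 2 bits, giving C(\textbf{X}) = 3 - 2 = 1 bit, D(\textbf{X}) = 6 - 4 = 2 bits, and \Omega(\textbf{X}) = -1 bit (synergy-dominated). If instead X_1 = X_2 = X_3 is a single fair bit copied three times, all of these entropies equal 1 bit, giving C(\textbf{X}) = 2 bits, D(\textbf{X}) = 1 bit, and \Omega(\textbf{X}) = +1 bit (redundancy-dominated).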
==History==