
Quantum contextuality

Quantum contextuality is a feature of the phenomenology of quantum mechanics whereby measurements of quantum observables cannot simply be thought of as revealing pre-existing values. Any attempt to do so in a realistic hidden-variable theory leads to values that are dependent upon the choice of the other (compatible) observables which are simultaneously measured. More formally, the measurement result of a quantum observable is dependent upon which other commuting observables are within the same measurement set.

Kochen and Specker
The need for contextuality was discussed informally in 1935 by Grete Hermann, but it was more than 30 years later when Simon B. Kochen and Ernst Specker, and separately John Bell, constructed proofs that any realistic hidden-variable theory able to explain the phenomenology of quantum mechanics is contextual for systems of Hilbert space dimension three and greater. The Kochen–Specker theorem proves that realistic noncontextual hidden-variable theories cannot reproduce the empirical predictions of quantum mechanics. Such a theory would suppose the following.

• All quantum-mechanical observables may be simultaneously assigned definite values (this is the realism postulate, which is false in standard quantum mechanics, since there are observables that are indefinite in every given quantum state). These global value assignments may deterministically depend on some "hidden" classical variable, which in turn may vary stochastically for some classical reason (as in statistical mechanics). The measured assignments of observables may therefore change stochastically. This stochasticity is, however, epistemic and not ontic, as in the standard formulation of quantum mechanics.
• Value assignments pre-exist and are independent of the choice of any other observables which, in standard quantum mechanics, are described as commuting with the measured observable and are measured alongside it.
• Some functional constraints on the assignments of values for compatible observables are assumed (e.g., they are additive and multiplicative; there are, however, several versions of this functional requirement).

In addition, Kochen and Specker constructed an explicitly noncontextual hidden-variable model for the two-dimensional qubit case in their paper on the subject, thereby completing the characterisation of the dimensionality of quantum systems that can demonstrate contextual behaviour.
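The impossibility of a noncontextual value assignment can be checked by brute force in small Kochen–Specker-type examples. As an illustration not taken from the text above, consider the Mermin–Peres magic square: nine two-qubit observables arranged in a 3×3 grid, where quantum mechanics forces the product of values along each row to be +1 and along each column to be +1, except the third column, which must be −1. A minimal sketch enumerating all ±1 assignments:

```python
from itertools import product

# Mermin-Peres magic square constraints: assign v[i][j] = ±1 to a 3x3 grid.
# Quantum mechanics forces: each row multiplies to +1; the columns multiply
# to +1, +1, and -1 respectively. Multiplying all nine entries row-wise
# gives +1, column-wise gives -1, so no assignment can exist.
def satisfies_all(v):
    rows_ok = all(v[i][0] * v[i][1] * v[i][2] == 1 for i in range(3))
    cols_ok = (v[0][0] * v[1][0] * v[2][0] == 1
               and v[0][1] * v[1][1] * v[2][1] == 1
               and v[0][2] * v[1][2] * v[2][2] == -1)
    return rows_ok and cols_ok

solutions = [
    g for g in product((-1, 1), repeat=9)
    if satisfies_all([g[0:3], g[3:6], g[6:9]])
]
print(len(solutions))  # 0: no noncontextual value assignment exists
```

The exhaustive search over all 2^9 = 512 assignments finds none satisfying the six product constraints, which is the finite combinatorial core of the Kochen–Specker obstruction.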
Bell's proof invoked a weaker version of Gleason's theorem, reinterpreting the theorem to show that quantum contextuality can exist only in Hilbert spaces of dimension greater than two.
Frameworks for contextuality
Sheaf-theoretic framework

The sheaf-theoretic, or Abramsky–Brandenburger, approach to contextuality, initiated by Samson Abramsky and Adam Brandenburger, is theory-independent and can be applied beyond quantum theory to any situation in which empirical data arise in contexts. As well as being used to study forms of contextuality arising in quantum theory and other physical theories, it has also been used to study formally equivalent phenomena in logic, relational databases, natural language processing, and constraint satisfaction. In essence, contextuality arises when empirical data are locally consistent but globally inconsistent. This framework gives rise in a natural way to a qualitative hierarchy of contextuality:

• (Probabilistic) contextuality may be witnessed in measurement statistics, e.g. by the violation of an inequality. A representative example is the KCBS proof of contextuality.
• Logical contextuality may be witnessed in the "possibilistic" information about which outcome events are possible and which are not. A representative example is Hardy's proof of nonlocality.
• Strong contextuality is a maximal form of contextuality. Whereas (probabilistic) contextuality arises when measurement statistics cannot be reproduced by a mixture of global value assignments, strong contextuality arises when no global value assignment is even compatible with the possible outcome events. A representative example is the original Kochen–Specker proof of contextuality.

Each level in this hierarchy strictly includes the next. An important intermediate level that lies strictly between the logical and strong contextuality classes is all-versus-nothing contextuality.

Graph and hypergraph frameworks

In a related framework, experimental scenarios are described by graphs, and certain invariants of these graphs were shown to have particular physical significance.
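Strong contextuality can be verified directly for the representative supra-quantum example, the Popescu–Rohrlich (PR) box, whose support contains exactly the outcome pairs with a XOR b = x AND y. One checks that none of the 16 global value assignments restricts to a possible event in every measurement context. A minimal sketch (the PR-box support is the standard definition; the encoding of assignments is illustrative):

```python
from itertools import product

# PR-box support: outcome pair (a, b) is possible in context (x, y)
# exactly when a XOR b = x AND y.
def possible(x, y, a, b):
    return (a ^ b) == (x & y)

# A global assignment fixes one outcome per measurement: (a0, a1, b0, b1).
# XOR-ing the four constraints gives 0 on the left and 1 on the right,
# so no assignment can be consistent with every context.
global_sections = [
    g for g in product((0, 1), repeat=4)
    if all(possible(x, y, g[x], g[2 + y]) for x, y in product((0, 1), repeat=2))
]
print(len(global_sections))  # 0: the PR box is strongly contextual
```

An empirical model whose support admits no such "global section" is strongly contextual in the sheaf-theoretic sense: no global value assignment is even compatible with the possible outcome events.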
One way in which contextuality may be witnessed in measurement statistics is through the violation of noncontextuality inequalities (also known as generalized Bell inequalities). With respect to certain appropriately normalised inequalities, the independence number, Lovász number, and fractional packing number of the graph of an experimental scenario provide tight upper bounds on the degree to which classical theories, quantum theory, and generalised probabilistic theories, respectively, may exhibit contextuality in an experiment of that kind. A more refined framework based on hypergraphs rather than graphs is also used.

Contextuality-by-Default (CbD) framework

In the Contextuality-by-Default (CbD) approach, developed by Ehtibar Dzhafarov, Janne Kujala, and colleagues, (non)contextuality is treated as a property of any system of random variables, defined as a set \mathcal{R} = \{R_q^c : q \in Q, q \prec c, c \in C\} in which each random variable R_q^c is labeled by its content q, the property it measures, and its context c, the set of recorded circumstances under which it is recorded (including, but not limited to, which other random variables it is recorded together with); q \prec c stands for "q is measured in c". The variables within a context are jointly distributed, but variables from different contexts are stochastically unrelated, defined on different sample spaces. A (probabilistic) coupling of the system \mathcal{R} is defined as a system S in which all variables are jointly distributed and, in any context c, R^c = \{R_q^c : q \in Q, q \prec c\} and S^c = \{S_q^c : q \in Q, q \prec c\} are identically distributed. The system is considered noncontextual if it has a coupling S such that the probabilities \Pr[S_q^c = S_q^{c'}] are the maximal possible for all contexts c, c' and contents q such that q \prec c, c'. If such a coupling does not exist, the system is contextual.
For the important class of cyclic systems of dichotomous (\pm1) random variables, \mathcal{C}_n = \big\{(R_1^1, R_2^1), (R_2^2, R_3^2), \ldots, (R_n^n, R_1^n)\big\} (n \geq 2), it has been shown that such a system is noncontextual if and only if D(\mathcal{C}_n) \leq \Delta(\mathcal{C}_n), where \Delta(\mathcal{C}_n) = (n - 2) + |\langle R_1^1\rangle - \langle R_1^n\rangle| + |\langle R_2^1\rangle - \langle R_2^2\rangle| + \ldots + |\langle R_n^{n-1}\rangle - \langle R_n^n\rangle|, and D(\mathcal{C}_n) = \max\big(\lambda_1 \langle R_1^1 R_2^1\rangle + \lambda_2 \langle R_2^2 R_3^2\rangle + \ldots + \lambda_n \langle R_n^n R_1^n\rangle\big), with the maximum taken over all \lambda_i = \pm1 whose product is -1. If R_q^c and R_q^{c'}, measuring the same content in different contexts, are always identically distributed, the system is called consistently connected (satisfying the "no-disturbance" or "no-signaling" principle). That nonlocality is a special case of contextuality follows in CbD (up to certain logical issues) from the fact that random variables being jointly distributed is equivalent to their being measurable functions of one and the same random variable (this generalizes Arthur Fine's analysis of Bell's theorem). CbD essentially coincides with the probabilistic part of Abramsky's sheaf-theoretic approach if the system is strongly consistently connected, which means that the joint distributions of \{R_{q_1}^c, \ldots, R_{q_k}^c\} and \{R_{q_1}^{c'}, \ldots, R_{q_k}^{c'}\} coincide whenever q_1, \ldots, q_k are measured in contexts c and c'. However, unlike most approaches to contextuality, CbD allows for inconsistent connectedness, with R_q^c and R_q^{c'} differently distributed. This makes CbD applicable to physics experiments in which the no-disturbance condition is violated, as well as to human behavior, where this condition is violated as a rule.
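For a consistently connected cyclic system with n = 4 (the CHSH scenario with unbiased marginals), the criterion above reduces to comparing D with \Delta = 2, since all expectation differences vanish. The computation can be sketched directly from the two formulas; the Tsirelson-bound correlations \pm 1/\sqrt{2} used as input are an illustrative assumption:

```python
from itertools import product
from math import prod, sqrt

def Delta(expectation_pairs):
    # expectation_pairs: (<R_q^c>, <R_q^c'>) for each content q
    # measured in its two contexts.
    n = len(expectation_pairs)
    return (n - 2) + sum(abs(e1 - e2) for e1, e2 in expectation_pairs)

def D(correlations):
    # correlations: [<R_1^1 R_2^1>, <R_2^2 R_3^2>, ..., <R_n^n R_1^n>];
    # maximum over all sign patterns lambda_i = ±1 with product -1.
    n = len(correlations)
    return max(
        sum(l * c for l, c in zip(lambdas, correlations))
        for lambdas in product((-1, 1), repeat=n)
        if prod(lambdas) == -1
    )

# CHSH at the Tsirelson bound: three correlations +1/sqrt(2), one -1/sqrt(2).
corr = [1 / sqrt(2)] * 3 + [-1 / sqrt(2)]
exps = [(0.0, 0.0)] * 4          # consistently connected, unbiased marginals
print(D(corr) > Delta(exps))     # True: 2*sqrt(2) > 2, the system is contextual
```

Here D evaluates to 2\sqrt{2} while \Delta = 2, so the noncontextuality criterion D \leq \Delta fails, in agreement with the violation of the CHSH inequality at the Tsirelson bound.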
In particular, Victor Cervantes, Ehtibar Dzhafarov, and colleagues have demonstrated that random variables describing certain paradigms of simple decision making form contextual systems, whereas many other decision-making systems are noncontextual once their inconsistent connectedness is properly taken into account.

Operational framework

An extended notion of contextuality due to Robert Spekkens applies to preparations and transformations as well as to measurements. With respect to measurements, it removes the assumption of determinism of value assignments that is present in standard definitions of contextuality. This breaks the interpretation of nonlocality as a special case of contextuality, and does not treat irreducible randomness as nonclassical. Nevertheless, it recovers the usual notion of contextuality when outcome determinism is imposed. Spekkens' contextuality can be motivated using Leibniz's law of the identity of indiscernibles. The law applied to physical systems in this framework mirrors the extended definition of noncontextuality. This was further explored by Simmons et al., who demonstrated that other notions of contextuality could also be motivated by Leibnizian principles, and could be thought of as tools enabling ontological conclusions from operational statistics.

Extracontextuality and extravalence

Given a pure quantum state |\psi\rangle, Born's rule states that the probability to obtain another state |\phi\rangle in a measurement is |\langle \phi | \psi \rangle|^2. However, such a number does not define a full probability distribution, i.e. values over a set of mutually exclusive events summing up to 1. In order to obtain such a set one needs to specify a context, that is, a complete set of commuting operators (CSCO), or equivalently a set of N orthogonal projectors |\phi_n\rangle\langle\phi_n| that sum to the identity, where N is the dimension of the Hilbert space. Then one has \sum_n |\langle \phi_n | \psi \rangle|^2 = 1 as expected. In that sense, the state vector |\psi\rangle alone is predictively incomplete as long as a context has not been specified.
The actual physical state, now defined by |\phi_n\rangle within a specified context, has been called a modality by Auffèves and Grangier. Since it is clear that |\psi\rangle alone does not define a modality, what is its status? If N \geq 3, one sees easily that |\psi\rangle is associated with an equivalence class of modalities belonging to different contexts but connected between themselves with certainty, even if the different CSCO observables do not commute. This equivalence class is called an extravalence class, and the associated transfer of certainty between contexts is called extracontextuality. As a simple example, the usual singlet state for two spins 1/2 can be found in the (non-commuting) CSCOs associated with the measurement of the total spin (with S=0, \; m=0), or with a Bell measurement; it actually appears in infinitely many different CSCOs, though obviously not in all possible ones. The concepts of extravalence and extracontextuality are very useful for spelling out the role of contextuality in quantum mechanics, which is not noncontextual (as classical physics would be), but not fully contextual either, since modalities belonging to incompatible (non-commuting) contexts may be connected with certainty. Starting from extracontextuality as a postulate, the fact that certainty can be transferred between contexts, and is then associated with a given projector, is the very basis of the hypotheses of Gleason's theorem, and thus of Born's rule. Also, associating a state vector with an extravalence class clarifies its status as a mathematical tool for calculating probabilities connecting modalities, which correspond to the actual observed physical events or results.
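The singlet example can be checked numerically: the singlet is simultaneously the S = 0, m = 0 eigenstate of the total-spin context and a common eigenstate of the Bell-measurement context \{X \otimes X, Z \otimes Z\}, even though the two CSCOs do not commute. A minimal numpy sketch (the operators are the standard Pauli matrices; nothing here is tied to the text's specific references):

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
kron = np.kron

# Context 1 (total spin): CSCO {S^2, S_z} for two spin-1/2 particles (hbar = 1).
S2 = sum(0.25 * (kron(s, I2) + kron(I2, s)) @ (kron(s, I2) + kron(I2, s))
         for s in (X, Y, Z))
Sz = 0.5 * (kron(Z, I2) + kron(I2, Z))

# Context 2 (Bell measurement): CSCO {X⊗X, Z⊗Z}.
XX, ZZ = kron(X, X), kron(Z, Z)

# The singlet (|01> - |10>)/sqrt(2) is a definite modality in BOTH contexts:
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
assert np.allclose(S2 @ singlet, 0 * singlet)   # S = 0
assert np.allclose(Sz @ singlet, 0 * singlet)   # m = 0
assert np.allclose(XX @ singlet, -singlet)      # Bell eigenvalue -1
assert np.allclose(ZZ @ singlet, -singlet)      # Bell eigenvalue -1

# ...even though the two CSCOs are incompatible:
print(np.allclose(Sz @ XX, XX @ Sz))  # False: S_z and X⊗X do not commute
```

The same state vector thus labels modalities in incompatible contexts connected with certainty, which is exactly the extravalence-class structure described above.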
Other frameworks and extensions

A form of contextuality that may be present in the dynamics of a quantum system was introduced by Shane Mansfield and Elham Kashefi, and has been shown to relate to computational quantum advantages. As a notion of contextuality that applies to transformations, it is inequivalent to that of Spekkens. Examples explored to date rely on additional memory constraints, which have a more computational than foundational motivation. Contextuality may be traded off against Landauer erasure to obtain equivalent advantages.
Fine's theorem
The Kochen–Specker theorem proves that quantum mechanics is incompatible with realistic noncontextual hidden-variable models. On the other hand, Bell's theorem proves that quantum mechanics is incompatible with factorisable hidden-variable models in an experiment in which measurements are performed at distinct spacelike-separated locations. Arthur Fine showed that in the experimental scenario in which the famous CHSH inequalities and proof of nonlocality apply, a factorisable hidden-variable model exists if and only if a noncontextual hidden-variable model exists. This equivalence was proven to hold more generally in any experimental scenario by Samson Abramsky and Adam Brandenburger. It is for this reason that we may consider nonlocality to be a special case of contextuality.
Measures of contextuality
Contextual fraction

A number of methods exist for quantifying contextuality. One approach is to measure the degree to which some particular noncontextuality inequality is violated, e.g. the KCBS inequality, the Yu–Oh inequality, or some Bell inequality. A more general measure of contextuality is the contextual fraction.

CbD-based measures

Within the CbD framework, for cyclic systems, if the system is contextual (i.e., D\left(\mathcal{C}_{n}\right)>\Delta\left(\mathcal{C}_{n}\right)), its degree of contextuality is \mathrm{CNT}_{2}=D\left(\mathcal{C}_{n}\right)-\Delta\left(\mathcal{C}_{n}\right); if it is noncontextual (D\left(\mathcal{C}_{n}\right)\leq\Delta\left(\mathcal{C}_{n}\right)), its degree of noncontextuality is \mathrm{NCNT}_{2}=\min\left(\Delta\left(\mathcal{C}_{n}\right)-D\left(\mathcal{C}_{n}\right),m\left(\mathcal{C}_{n}\right)\right), where m\left(\mathcal{C}_{n}\right) is the L_1-distance from the vector \mathbf{p}\in\mathbb{P} to the surface of the box circumscribing the noncontextuality polytope. More generally, \mathrm{CNT}_{2} and \mathrm{NCNT}_{2} are computed by means of linear programming.
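The contextual fraction itself is computed by linear programming: maximize the total weight of a subprobability distribution over global value assignments that fits under the empirical probabilities; the contextual fraction is one minus that maximal weight. A sketch for the CHSH scenario using scipy (the Tsirelson-bound box is an illustrative input; the function name and encoding are assumptions, not a library API):

```python
from itertools import product
import numpy as np
from scipy.optimize import linprog

def contextual_fraction(E):
    """E[x][y] = correlation <a*b> in context (x, y), outcomes a, b = ±1,
    with uniform marginals assumed. Returns the contextual fraction."""
    assignments = list(product((-1, 1), repeat=4))  # (a0, a1, b0, b1)
    rows, b_ub = [], []
    for (x, y), a, b in product(product((0, 1), repeat=2), (-1, 1), (-1, 1)):
        # empirical probability of outcome pair (a, b) in context (x, y)
        b_ub.append((1 + a * b * E[x][y]) / 4)
        rows.append([1.0 if (g[x] == a and g[2 + y] == b) else 0.0
                     for g in assignments])
    # maximize total noncontextual weight  <=>  minimize -sum(w)
    res = linprog(c=-np.ones(len(assignments)), A_ub=np.array(rows),
                  b_ub=np.array(b_ub), bounds=(0, None))
    return 1.0 + res.fun  # 1 - (maximal noncontextual weight)

s = 1 / np.sqrt(2)
print(contextual_fraction([[s, s], [s, -s]]))  # ≈ sqrt(2) - 1, Tsirelson box
print(contextual_fraction([[1, 1], [1, -1]]))  # 1.0, PR box (strongly contextual)
```

The strongly contextual PR box attains the maximal contextual fraction of 1, while the quantum Tsirelson-bound box attains \sqrt{2}-1, matching its normalized CHSH violation.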
Contextuality as a resource for quantum computing
Recently, quantum contextuality has been investigated as a source of quantum advantage and computational speedups in quantum computing.

Magic state distillation

Magic state distillation is a scheme for quantum computing in which quantum circuits constructed only of Clifford operators, which by themselves are fault-tolerant but efficiently classically simulable, are injected with certain "magic" states that promote the computational power to universal fault-tolerant quantum computing. In 2014, Mark Howard et al. showed that contextuality characterizes magic states for qudits of odd prime dimension and for qubits with real wavefunctions. Extensions to the qubit case have been investigated by Juani Bermejo-Vega et al.

Measurement-based quantum computing

Measurement-based quantum computation (MBQC) is a model for quantum computing in which a classical control computer interacts with a quantum system by specifying measurements to be performed and receiving measurement outcomes in return. The measurement statistics for the quantum system may or may not exhibit contextuality. A variety of results have shown that the presence of contextuality enhances the computational power of an MBQC. In particular, researchers have considered an artificial situation in which the power of the classical control computer is restricted to only being able to compute linear Boolean functions, i.e. to solve problems in the Parity L complexity class ⊕L. For interactions with multi-qubit quantum systems a natural assumption is that each step of the interaction consists of a binary choice of measurement which in turn returns a binary outcome. An MBQC of this restricted kind is known as an l2-MBQC.

Anders and Browne

In 2009, Janet Anders and Dan Browne showed that two specific examples of nonlocality and contextuality were sufficient to compute a non-linear function. This in turn could be used to boost computational power to that of a universal classical computer, i.e.
to solve problems in the complexity class P. This is sometimes referred to as measurement-based classical computation. The specific examples made use of the Greenberger–Horne–Zeilinger nonlocality proof and the supra-quantum Popescu–Rohrlich box.

Raussendorf

In 2013, Robert Raussendorf showed more generally that access to strongly contextual measurement statistics is necessary and sufficient for an l2-MBQC to compute a non-linear function. He also showed that computing non-linear Boolean functions with sufficiently high probability requires contextuality.

Further examples

• Contextuality has been linked to quantum advantages in state-discrimination tasks.
• In classical simulations of quantum systems, contextuality has been shown to incur memory costs.
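The GHZ-based construction can be illustrated numerically: measuring each qubit of a GHZ state in the X basis (for input bit 0) or Y basis (for input bit 1), with the three input bits constrained to (a, b, a⊕b), yields outcomes whose parity deterministically equals OR(a, b), a non-linear function that a linear ⊕L side-processor cannot produce on its own. A numpy sketch (the encoding follows the standard GHZ example; variable names are illustrative):

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
ghz = np.zeros(8, dtype=complex)
ghz[0] = ghz[7] = 1 / np.sqrt(2)         # (|000> + |111>)/sqrt(2)

def parity_of_outcomes(a, b):
    # Input bits (a, b, a XOR b) select the local bases: 0 -> X, 1 -> Y.
    bits = (a, b, a ^ b)
    ops = [X if bit == 0 else Y for bit in bits]
    M = np.kron(np.kron(ops[0], ops[1]), ops[2])
    ev = np.real(np.vdot(ghz, M @ ghz))  # GHZ is an eigenstate: ev = ±1 exactly
    return 0 if ev > 0 else 1            # XOR of the three ±1 outcomes, as a bit

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    assert parity_of_outcomes(a, b) == (a | b)
print("GHZ measurements compute OR, a non-linear Boolean function")
```

Because the GHZ state is a ±1 eigenstate of each of the four product operators (XXX, XYY, YXY, YYX), the parity of the outcomes is deterministic, and the XOR side-processing allowed in an l2-MBQC suffices to read off the non-linear OR.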