
Quantum Fisher information

The quantum Fisher information is a central quantity in quantum metrology and is the quantum analogue of the classical Fisher information. It is one of the central quantities used to quantify the utility of an input state, especially in Mach–Zehnder interferometer-based phase or parameter estimation. The quantum Fisher information can also be a sensitive probe of a quantum phase transition. The quantum Fisher information of a state \varrho with respect to an observable A is defined as : F_{\rm Q}[\varrho,A]=2\sum_{k,l}\frac{(\lambda_k-\lambda_l)^2}{\lambda_k+\lambda_l}\vert\langle k \vert A \vert l\rangle\vert^2, where \lambda_k and \vert k \rangle are the eigenvalues and eigenvectors of the density matrix \varrho=\sum_k \lambda_k \vert k\rangle\langle k\vert, and the sum runs over all k and l with \lambda_k+\lambda_l>0.
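As a concrete illustration (not from the source), the eigendecomposition formula above can be evaluated numerically. The following Python sketch, assuming NumPy and a hypothetical single-qubit example, computes the quantum Fisher information of the pure state |+⟩ with generator A = σ_z/2, where it equals four times the variance:

```python
import numpy as np

# Eigendecomposition form of the definition: with rho = sum_k lam_k |k><k|,
# F_Q[rho, A] = 2 sum_{k,l} (lam_k - lam_l)^2 / (lam_k + lam_l) |<k|A|l>|^2.
def qfi(rho, A, tol=1e-12):
    lam, V = np.linalg.eigh(rho)
    Ak = V.conj().T @ A @ V           # matrix elements <k|A|l>
    return sum(2 * (lam[k] - lam[l])**2 / (lam[k] + lam[l]) * abs(Ak[k, l])**2
               for k in range(len(lam)) for l in range(len(lam))
               if lam[k] + lam[l] > tol)   # skip terms where both eigenvalues vanish

# Example: pure state |+> with generator A = sigma_z / 2.
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho_plus = np.outer(plus, plus)
A = np.diag([0.5, -0.5])              # sigma_z / 2
F_plus = qfi(rho_plus, A)             # equals 4 * variance = 1 for this pure state
```

For pure states the double sum reduces to 4(⟨A²⟩ − ⟨A⟩²), consistent with the relation to the variance discussed later in the article.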

Connection with Fisher information
The classical Fisher information for measuring an observable B on the density matrix \varrho(\theta) is defined as : F[B,\theta]=\sum_b\frac{1}{p(b|\theta)}\left(\frac{\partial p(b|\theta)}{\partial \theta}\right)^2, where p(b|\theta)=\langle b\vert \varrho(\theta)\vert b \rangle is the probability of obtaining outcome b when measuring the observable B on the transformed density matrix \varrho(\theta), and b is the eigenvalue corresponding to the eigenvector \vert b \rangle of B. The quantum Fisher information is the supremum of the classical Fisher information over all such observables, : F_{\rm Q}[\varrho,A]=\sup_{B} F[B,\theta].
Relation to the symmetric logarithmic derivative
The quantum Fisher information equals the expectation value of L_{\varrho}^2, that is, F_{\rm Q}[\varrho,\theta]={\rm Tr}(\varrho L_{\varrho}^2), where L_{\varrho} is the symmetric logarithmic derivative, defined implicitly by : \partial_{\theta}\varrho(\theta)=\frac{1}{2}\left(L_{\varrho}\varrho+\varrho L_{\varrho}\right).
Equivalent expressions
For a unitary encoding operation \varrho(\theta)=\exp(-iA\theta)\varrho_0\exp(+iA\theta), the quantum Fisher information can be computed as an integral, : F_{\rm Q}[\varrho,A] = -2\int_0^\infty\text{tr}\left(\exp(-\varrho_0 t)[\varrho_0,A] \exp(-\varrho_0 t)[\varrho_0,A]\right)\ dt, where [\ ,\ ] on the right-hand side denotes the commutator. It can also be expressed in terms of the Kronecker product and vectorization, : F_{\rm Q}[\varrho,A] = 2\,\text{vec}([\varrho_0,A])^\dagger\big(\varrho_0^*\otimes {\rm I}+{\rm I}\otimes\varrho_0\big)^{-1}\text{vec}([\varrho_0,A]), where ^* denotes the complex conjugate and ^\dagger denotes the conjugate transpose. This formula holds for invertible density matrices. For non-invertible density matrices, the inverse above is replaced by the Moore–Penrose pseudoinverse. Alternatively, one can compute the quantum Fisher information of the invertible state \varrho_\nu=(1-\nu)\varrho_0+\nu\pi, where \pi is any full-rank density matrix, and then take the limit \nu \rightarrow 0^+ to obtain the quantum Fisher information of \varrho_0. The density matrix \pi can be, for example, {\rm Identity}/\dim{\mathcal{H}} in a finite-dimensional system, or a thermal state in an infinite-dimensional system.
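As a numerical cross-check (an illustrative sketch, not from the source), the vectorization formula can be compared with the eigendecomposition form. The snippet below assumes NumPy, column-stacking vectorization, and a hypothetical invertible single-qubit state chosen for illustration:

```python
import numpy as np

def qfi_eig(rho, A, tol=1e-12):
    # Eigendecomposition formula for the quantum Fisher information.
    lam, V = np.linalg.eigh(rho)
    Ak = V.conj().T @ A @ V
    return sum(2 * (lam[k] - lam[l])**2 / (lam[k] + lam[l]) * abs(Ak[k, l])**2
               for k in range(len(lam)) for l in range(len(lam))
               if lam[k] + lam[l] > tol)

def qfi_vec(rho, A):
    # Vectorization formula: 2 vec([rho,A])^dag (rho^* (x) I + I (x) rho)^{-1} vec([rho,A]).
    d = rho.shape[0]
    I = np.eye(d)
    comm = rho @ A - A @ rho                      # [rho_0, A]
    c = comm.flatten(order="F")                   # column-stacking vectorization
    M = np.kron(rho.conj(), I) + np.kron(I, rho)  # rho_0^* (x) I + I (x) rho_0
    return 2 * (c.conj() @ np.linalg.solve(M, c)).real

rho0 = np.diag([0.7, 0.3])                # invertible density matrix
A = np.array([[0, 0.5], [0.5, 0]])        # sigma_x / 2
F_vec = qfi_vec(rho0, A)
F_eig = qfi_eig(rho0, A)                  # both evaluate to 0.16 here
```

For non-invertible states, `np.linalg.solve` would be replaced by a pseudoinverse (`np.linalg.pinv`), mirroring the Moore–Penrose prescription in the text.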
Generalization and relations to Bures metric and quantum fidelity
For any differentiable parametrization of the density matrix \varrho(\boldsymbol{\theta}) by a vector of parameters \boldsymbol{\theta}=(\theta_1,\dots,\theta_n), the quantum Fisher information matrix is defined as : F_{\rm Q}^{ij}[\varrho(\boldsymbol{\theta})]=2\sum_{k,l} \frac{\operatorname{Re}(\langle k \vert \partial_{i}\varrho \vert l\rangle \langle l \vert \partial_{j}\varrho \vert k\rangle )}{\lambda_k+\lambda_l}, where \partial_i denotes the partial derivative with respect to the parameter \theta_i. The formula also holds without taking the real part \operatorname{Re}, because the imaginary part leads to an antisymmetric contribution that vanishes under the sum. Note that all eigenvalues \lambda_k and eigenvectors \vert k\rangle of the density matrix potentially depend on the vector of parameters \boldsymbol{\theta}. This definition is identical to four times the Bures metric, up to singular points where the rank of the density matrix changes (those are the points at which \lambda_k+\lambda_l suddenly becomes zero). Through this relation, it also connects with the quantum fidelity F(\varrho,\sigma)=\left(\mathrm{tr}\left[\sqrt{\sqrt{\varrho}\sigma\sqrt{\varrho}}\right]\right)^2 of two infinitesimally close states, : F(\varrho_{\boldsymbol{\theta}},\varrho_{\boldsymbol{\theta}+d\boldsymbol{\theta}})=1-\frac{1}{4}\sum_{i,j}\Big(F_{\rm Q}^{ij}[\varrho(\boldsymbol{\theta})]+2\!\!\sum_{\lambda_k(\boldsymbol{\theta})=0}\!\!\partial_i\partial_j\lambda_k\Big)d\theta_i d\theta_j+\mathcal{O}(d\theta^3), where the inner sum runs over all k for which \lambda_k(\boldsymbol{\theta})=0. The extra term (which is zero in most applications) can be avoided by taking a symmetric expansion of the fidelity, : F\left(\varrho_{\boldsymbol{\theta}-d\boldsymbol{\theta}/2},\varrho_{\boldsymbol{\theta}+d\boldsymbol{\theta}/2}\right)=1-\frac{1}{4}\sum_{i,j}F_{\rm Q}^{ij}[\varrho(\boldsymbol{\theta})]d\theta_i d\theta_j+\mathcal{O}(d\theta^3).
For n=1 and unitary encoding, the quantum Fisher information matrix reduces to the original definition. The quantum Fisher information matrix is part of a wider family of quantum statistical distances.
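For n=1, the symmetric fidelity expansion above can be verified numerically. The following sketch (NumPy assumed; the mixed qubit state and generator are hypothetical choices for illustration) estimates the quantum Fisher information from the fidelity of two infinitesimally close states; for \varrho_0={\rm diag}(0.9,0.1) with generator \sigma_x/2 the exact value is (2p-1)^2 = 0.64:

```python
import numpy as np

def psd_sqrt(m):
    # Matrix square root of a positive semidefinite Hermitian matrix.
    lam, V = np.linalg.eigh(m)
    return (V * np.sqrt(np.clip(lam, 0, None))) @ V.conj().T

def fidelity(r, s):
    # Uhlmann fidelity (tr sqrt(sqrt(r) s sqrt(r)))^2.
    sr = psd_sqrt(r)
    return np.trace(psd_sqrt(sr @ s @ sr)).real ** 2

sx = np.array([[0, 1], [1, 0]], dtype=complex)
rho0 = np.diag([0.9, 0.1]).astype(complex)

def rho_theta(theta):
    # Unitary encoding exp(-i sx theta / 2) rho0 exp(+i sx theta / 2).
    U = np.cos(theta / 2) * np.eye(2) - 1j * np.sin(theta / 2) * sx
    return U @ rho0 @ U.conj().T

dth = 1e-3
# Symmetric expansion: F ~ 1 - F_Q dtheta^2 / 4, so F_Q ~ 4 (1 - F) / dtheta^2.
F_est = 4 * (1 - fidelity(rho_theta(-dth / 2), rho_theta(dth / 2))) / dth**2
```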
Relation to fidelity susceptibility
Assuming that \vert \psi_0(\theta)\rangle is a ground state of a parameter-dependent non-degenerate Hamiltonian H(\theta), four times the quantum Fisher information of this state is called the fidelity susceptibility, and denoted : \chi_F=4F_Q(\vert\psi_0(\theta)\rangle). The fidelity susceptibility measures the sensitivity of the ground state to the parameter, and its divergence indicates a quantum phase transition. This is because of the aforementioned connection with fidelity: a diverging quantum Fisher information means that \vert\psi_0(\theta)\rangle and \vert\psi_0(\theta+d\theta)\rangle are orthogonal to each other for any infinitesimal change in the parameter d\theta, and thus are said to undergo a phase transition at the point \theta.
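As an illustration (a toy model, not from the source), the ground-state quantum Fisher information can be estimated from overlaps of nearby ground states. The sketch below assumes NumPy and the hypothetical single-qubit family H(\theta)=\cos\theta\,\sigma_z+\sin\theta\,\sigma_x, whose ground state rotates rigidly, so its pure-state quantum Fisher information is exactly 1 for all \theta:

```python
import numpy as np

sz = np.diag([1.0, -1.0])
sx = np.array([[0.0, 1.0], [1.0, 0.0]])

def ground_state(theta):
    # Ground state of the toy Hamiltonian H(theta) = cos(theta) sz + sin(theta) sx.
    H = np.cos(theta) * sz + np.sin(theta) * sx
    lam, V = np.linalg.eigh(H)
    return V[:, 0]                     # eigenvector of the lowest eigenvalue

theta, dth = 0.3, 1e-4
ov = abs(ground_state(theta) @ ground_state(theta + dth)) ** 2
# Pure-state QFI from the overlap: 1 - |<psi(t)|psi(t+dt)>|^2 ~ F_Q dt^2 / 4.
F_est = 4 * (1 - ov) / dth**2
```

A divergence of this quantity as a system parameter is swept would signal the quantum phase transition mentioned above; in this two-level toy model it stays finite.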
Convexity properties
The quantum Fisher information equals four times the variance for pure states, : F_{\rm Q}[\vert \Psi \rangle,H] = 4 (\Delta H)^2_{\Psi}. For mixed states, when the probabilities are parameter independent, i.e., when p(\theta)=p, the quantum Fisher information is convex: :F_{\rm Q}[p \varrho_1(\theta) + (1-p) \varrho_2(\theta) ,H] \le p F_{\rm Q}[\varrho_1(\theta),H]+(1-p)F_{\rm Q}[\varrho_2(\theta),H]. The quantum Fisher information is the largest function that is convex and that equals four times the variance for pure states. That is, it equals four times the convex roof of the variance, :F_{\rm Q}[\varrho,H] = 4 \inf_{\{p_k,\vert \Psi_k \rangle \}} \sum_k p_k (\Delta H)^2_{\Psi_k}, where the infimum is over all decompositions of the density matrix :\varrho=\sum_k p_k \vert \Psi_k\rangle \langle \Psi_k \vert. Note that the \vert \Psi_k\rangle are not necessarily orthogonal to each other. The above optimization can be rewritten as an optimization over the two-copy space as : F_Q[\varrho,H]= \min_{\varrho_{12}} 2{\rm Tr}[(H\otimes {\rm Identity}-{\rm Identity}\otimes H)^2\varrho_{12}], such that \varrho_{12} is a symmetric separable state and : {\rm Tr}_1(\varrho_{12})={\rm Tr}_2(\varrho_{12})=\varrho. The above statement has later been proved even for the case of a minimization over general (not necessarily symmetric) separable states. When the probabilities are \theta-dependent, an extended-convexity relation has been proved: :F_{\rm Q}\Big[\sum_i p_i(\theta) \varrho_i(\theta)\Big] \le \sum_i p_i(\theta) F_{\rm Q}[\varrho_i(\theta)]+F_{\rm C}[\{p_i(\theta)\}], where F_{\rm C}[\{p_i(\theta)\}]=\sum_i \frac{(\partial_{\theta} p_i(\theta))^2}{p_i(\theta)} is the classical Fisher information associated to the probabilities contributing to the convex decomposition. The first term on the right-hand side of the above inequality can be considered as the average quantum Fisher information of the density matrices in the convex decomposition.
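The convexity inequality for parameter-independent probabilities can be checked numerically. The sketch below (NumPy assumed; the two states are hypothetical choices for illustration) mixes a pure state with F_Q = 1 and a generator eigenstate with F_Q = 0:

```python
import numpy as np

def qfi(rho, A, tol=1e-12):
    # Eigendecomposition formula for the quantum Fisher information.
    lam, V = np.linalg.eigh(rho)
    Ak = V.conj().T @ A @ V
    return sum(2 * (lam[k] - lam[l])**2 / (lam[k] + lam[l]) * abs(Ak[k, l])**2
               for k in range(len(lam)) for l in range(len(lam))
               if lam[k] + lam[l] > tol)

H = np.diag([0.5, -0.5])                    # sigma_z / 2
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho1 = np.outer(plus, plus)                 # |+><+| : F_Q = 4 * variance = 1
rho2 = np.diag([1.0, 0.0])                  # |0><0| : F_Q = 0
p = 0.5
lhs = qfi(p * rho1 + (1 - p) * rho2, H)     # QFI of the mixture
rhs = p * qfi(rho1, H) + (1 - p) * qfi(rho2, H)   # average of the QFIs = 0.5
```

Here the mixture's quantum Fisher information comes out strictly below the average, consistent with convexity.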
Inequalities for composite systems
To study quantum metrology of many-particle systems, we need to understand the behavior of the quantum Fisher information in composite systems. For product states, :F_{\rm Q}[\varrho_1 \otimes \varrho_2 , H_1\otimes {\rm Identity}+{\rm Identity} \otimes H_2] = F_{\rm Q}[\varrho_1,H_1]+F_{\rm Q}[\varrho_2,H_2] holds. For the reduced state, we have :F_{\rm Q}[\varrho_{12}, H_1\otimes {\rm Identity}_2] \ge F_{\rm Q}[\varrho_{1}, H_1], where \varrho_{1}={\rm Tr}_2(\varrho_{12}).
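The additivity over product states can be verified directly. The following sketch (NumPy assumed; the two single-qubit states and local generators are hypothetical choices for illustration) builds a two-qubit product state with Kronecker products:

```python
import numpy as np

def qfi(rho, A, tol=1e-12):
    # Eigendecomposition formula for the quantum Fisher information.
    lam, V = np.linalg.eigh(rho)
    Ak = V.conj().T @ A @ V
    return sum(2 * (lam[k] - lam[l])**2 / (lam[k] + lam[l]) * abs(Ak[k, l])**2
               for k in range(len(lam)) for l in range(len(lam))
               if lam[k] + lam[l] > tol)

rho1 = np.diag([0.8, 0.2])                  # mixed qubit: F_Q = (2p-1)^2 = 0.36
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho2 = np.outer(plus, plus)                 # pure qubit: F_Q = 1
H1 = np.array([[0, 0.5], [0.5, 0]])         # sigma_x / 2
H2 = np.diag([0.5, -0.5])                   # sigma_z / 2
I2 = np.eye(2)

H_total = np.kron(H1, I2) + np.kron(I2, H2)
lhs = qfi(np.kron(rho1, rho2), H_total)     # QFI of the product state
rhs = qfi(rho1, H1) + qfi(rho2, H2)         # sum of local QFIs = 1.36
```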
Relation to entanglement
There are strong links between quantum metrology and quantum information science. For a multiparticle system of N spin-1/2 particles, :F_{\rm Q}[\varrho, J_z] \le N holds for separable states, where : J_z=\sum_{n=1}^N j_z^{(n)}, and j_z^{(n)} is a single-particle angular momentum component. The maximum for general quantum states is given by :F_{\rm Q}[\varrho, J_z] \le N^2. Hence, quantum entanglement is needed to reach the maximum precision in quantum metrology. Moreover, for quantum states with an entanglement depth k, :F_{\rm Q}[\varrho, J_z] \le sk^2 + r^{2} holds, where s=\lfloor N/k \rfloor is the largest integer smaller than or equal to N/k, and r=N-sk is the remainder of dividing N by k. Hence, higher and higher levels of multipartite entanglement are needed to achieve better and better accuracy in parameter estimation. It is possible to obtain a weaker but simpler bound, :F_{\rm Q}[\varrho, J_z] \le Nk. Hence, a lower bound on the entanglement depth is obtained as :\frac{F_{\rm Q}[\varrho, J_z]}{N} \le k. A related concept is the quantum metrological gain, which for a given Hamiltonian is defined as the ratio of the quantum Fisher information of a state to the maximum of the quantum Fisher information for the same Hamiltonian over separable states, : g_{\mathcal H}(\varrho)=\frac{\mathcal F_Q[\varrho,{\mathcal H}]}{\mathcal F_Q^{({\rm sep})}(\mathcal H)}, where the Hamiltonian is \mathcal H=h_1+h_2+...+h_N, and h_n acts on the nth spin. The metrological gain is defined by an optimization over all local Hamiltonians as : g(\varrho)=\max_{\mathcal H}g_{\mathcal H}(\varrho).
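The gap between the separable bound N and the ultimate bound N^2 is illustrated by the GHZ state, which saturates the latter. The sketch below (NumPy assumed) builds the collective J_z and the 3-qubit GHZ state explicitly:

```python
import numpy as np

def qfi(rho, A, tol=1e-12):
    # Eigendecomposition formula for the quantum Fisher information.
    lam, V = np.linalg.eigh(rho)
    Ak = V.conj().T @ A @ V
    return sum(2 * (lam[k] - lam[l])**2 / (lam[k] + lam[l]) * abs(Ak[k, l])**2
               for k in range(len(lam)) for l in range(len(lam))
               if lam[k] + lam[l] > tol)

N = 3
sz = np.diag([0.5, -0.5])                   # single-particle j_z
# Collective J_z = sum over particles of Identity (x) ... (x) j_z (x) ... (x) Identity.
Jz = sum(np.kron(np.kron(np.eye(2**n), sz), np.eye(2**(N - 1 - n)))
         for n in range(N))

# GHZ state (|00...0> + |11...1>) / sqrt(2).
ghz = np.zeros(2**N)
ghz[0] = ghz[-1] = 1 / np.sqrt(2)
F_ghz = qfi(np.outer(ghz, ghz), Jz)         # reaches N^2 = 9, above the separable bound N = 3
```

By the entanglement-depth criterion above, F_Q/N = 3 > 1 certifies that the state is entangled.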
Measuring the Fisher information
The error propagation formula gives a lower bound on the quantum Fisher information, : F_{\rm Q}[\varrho,H]\ge \frac{\langle i[H,M] \rangle_{\varrho}^2}{(\Delta M)^2}, where M is an arbitrary operator. This formula can be used to put a lower bound on the quantum Fisher information from experimental results. If M equals the symmetric logarithmic derivative, then the inequality is saturated. For the case of unitary dynamics, the quantum Fisher information is four times the convex roof of the variance. Based on that, one can obtain lower bounds on it from some given operator expectation values using semidefinite programming. The approach considers an optimization over the two-copy space. There are also numerical methods that provide an optimal lower bound for the quantum Fisher information based on the expectation values of some operators, using the theory of Legendre transforms rather than semidefinite programming. In some cases, the bounds can even be obtained analytically. For instance, for an N-qubit Greenberger–Horne–Zeilinger (GHZ) state, : \frac{F_{\rm Q}[\varrho,J_z]}{N^2}\ge (1-2F_{\rm GHZ})^2, provided that the fidelity with respect to the GHZ state satisfies : F_{\rm GHZ}={\rm Tr}(\varrho|{\rm GHZ}\rangle\langle{\rm GHZ}|)\ge 1/2; otherwise the optimal lower bound is zero. So far, we discussed bounding the quantum Fisher information for unitary dynamics. It is also possible to bound the quantum Fisher information for more general, non-unitary dynamics. This approach is based on the relation between the fidelity and the quantum Fisher information, and on the fact that the fidelity can be computed via semidefinite programming. For systems in thermal equilibrium, the quantum Fisher information can be obtained from the dynamic susceptibility.
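The error-propagation bound can be illustrated numerically. In the sketch below (NumPy assumed; the state |+⟩, generator H = σ_z/2, and measured operator M = σ_y are hypothetical choices for illustration), the bound happens to be saturated:

```python
import numpy as np

def qfi(rho, A, tol=1e-12):
    # Eigendecomposition formula for the quantum Fisher information.
    lam, V = np.linalg.eigh(rho)
    Ak = V.conj().T @ A @ V
    return sum(2 * (lam[k] - lam[l])**2 / (lam[k] + lam[l]) * abs(Ak[k, l])**2
               for k in range(len(lam)) for l in range(len(lam))
               if lam[k] + lam[l] > tol)

sy = np.array([[0, -1j], [1j, 0]])
H = np.diag([0.5, -0.5]).astype(complex)       # generator sigma_z / 2
plus = np.array([1.0, 1.0], dtype=complex) / np.sqrt(2)
rho = np.outer(plus, plus.conj())

comm = 1j * (H @ sy - sy @ H)                  # i[H, M] (equals sigma_x here)
meanM = np.trace(rho @ sy).real
varM = np.trace(rho @ sy @ sy).real - meanM**2  # (Delta M)^2 = 1
bound = np.trace(rho @ comm).real ** 2 / varM   # error-propagation lower bound = 1
F = qfi(rho, H)                                 # also 1: the bound is saturated
```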
Relation to the Wigner–Yanase skew information
The Wigner–Yanase skew information is defined as :I(\varrho,H)={\rm Tr}(H^2\varrho)-{\rm Tr}(H \sqrt{\varrho} H \sqrt{\varrho}). It follows that I(\varrho,H) is convex in \varrho. For the quantum Fisher information and the Wigner–Yanase skew information, the inequality :F_{\rm Q}[\varrho,H] \ge 4 I(\varrho,H) holds, where there is an equality for pure states.
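The inequality between the two quantities can be checked numerically. The sketch below (NumPy assumed; the mixed qubit state is a hypothetical choice for illustration) evaluates both sides for \varrho={\rm diag}(0.8,0.2) and H = σ_x/2, where F_Q = 0.36 and I = 0.05:

```python
import numpy as np

def qfi(rho, A, tol=1e-12):
    # Eigendecomposition formula for the quantum Fisher information.
    lam, V = np.linalg.eigh(rho)
    Ak = V.conj().T @ A @ V
    return sum(2 * (lam[k] - lam[l])**2 / (lam[k] + lam[l]) * abs(Ak[k, l])**2
               for k in range(len(lam)) for l in range(len(lam))
               if lam[k] + lam[l] > tol)

def skew_info(rho, H):
    # Wigner-Yanase skew information Tr(H^2 rho) - Tr(H sqrt(rho) H sqrt(rho)).
    lam, V = np.linalg.eigh(rho)
    sqrt_rho = (V * np.sqrt(np.clip(lam, 0, None))) @ V.conj().T
    return (np.trace(H @ H @ rho) - np.trace(H @ sqrt_rho @ H @ sqrt_rho)).real

rho = np.diag([0.8, 0.2])
H = np.array([[0, 0.5], [0.5, 0]])      # sigma_x / 2
F = qfi(rho, H)                         # (2p - 1)^2 = 0.36
I_WY = skew_info(rho, H)                # 1/4 - sqrt(p(1-p))/2 = 0.05, so 4I = 0.2 <= F
```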
Relation to the variance
For any decomposition of the density matrix given by p_k and \vert \Psi_k\rangle, the relation :(\Delta H)^2 \ge \sum_k p_k (\Delta H)^2_{\Psi_k} \ge \frac{1}{4} F_{\rm Q}[\varrho,H] holds, where both inequalities are tight. That is, there is a decomposition for which the second inequality is saturated, which is the same as stating that the quantum Fisher information is four times the convex roof of the variance, as discussed above. There is also a decomposition for which the first inequality is saturated, which means that the variance is its own concave roof, :(\Delta H)^2 = \sup_{\{p_k,\vert \Psi_k \rangle \}} \sum_k p_k (\Delta H)^2_{\Psi_k}.
Uncertainty relations with the quantum Fisher information and the variance
Knowing that the quantum Fisher information is four times the convex roof of the variance, we obtain the relation : (\Delta A)^2 F_Q[\varrho,B] \geq \vert \langle i[A,B]\rangle\vert^2, which is stronger than the Heisenberg uncertainty relation. For a particle of spin-j, the following uncertainty relation holds: : (\Delta J_x)^2+(\Delta J_y)^2+(\Delta J_z)^2\ge j, where J_l are angular momentum components. The relation can be strengthened as : (\Delta J_x)^2+(\Delta J_y)^2+F_Q[\varrho,J_z]/4\ge j.
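The strengthened uncertainty relation (\Delta A)^2 F_Q[\varrho,B] \ge |\langle i[A,B]\rangle|^2 can also be checked numerically. The sketch below (NumPy assumed; the mixed qubit state and the choice A = σ_x/2, B = σ_z/2 are hypothetical, picked for illustration) evaluates both sides:

```python
import numpy as np

def qfi(rho, A, tol=1e-12):
    # Eigendecomposition formula for the quantum Fisher information.
    lam, V = np.linalg.eigh(rho)
    Ak = V.conj().T @ A @ V
    return sum(2 * (lam[k] - lam[l])**2 / (lam[k] + lam[l]) * abs(Ak[k, l])**2
               for k in range(len(lam)) for l in range(len(lam))
               if lam[k] + lam[l] > tol)

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.diag([1.0, -1.0]).astype(complex)
# A mixed single-qubit state with a complex off-diagonal element.
rho = np.array([[0.8, -0.1j], [0.1j, 0.2]])

A, B = sx / 2, sz / 2
varA = (np.trace(rho @ A @ A) - np.trace(rho @ A)**2).real   # (Delta A)^2
comm_mean = np.trace(rho @ (1j * (A @ B - B @ A))).real      # <i[A, B]>
lhs = varA * qfi(rho, B)
rhs = comm_mean ** 2                                         # 0.01 for this state
```

For this particular qubit state the two sides coincide, so the inequality is tight here; for generic states the left-hand side is strictly larger.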