Local hidden variable theoretic measure of quantumness of mutual information

Cited by: 0
Authors
Puri, R. R. [1 ]
Affiliations
[1] Indian Inst Technol, Dept Phys, Bombay 400076, Maharashtra, India
Keywords
local hidden variables; quantum discord; symmetric discord; measurement-induced disturbance
DOI
10.1088/1751-8113/47/11/115303
CLC classification
O4 [Physics]
Discipline code
0702
Abstract
Entanglement, a manifestation of the quantumness of correlations between the observables of the subsystems of a composite system, and the quantumness of their mutual information are widely studied characteristics of a system of spin-1/2 particles. The concept of quantumness of correlations between the observables of a system is based on the incommensurability of the correlations with the predictions of some local hidden variable (LHV) theory. However, the concept of quantumness of mutual information does not invoke the LHV theory explicitly. In this paper, the concept of quantumness of mutual information for a system of two spin-1/2 particles, named A and B, in the state described by the density matrix $\hat{\rho}_{AB}$ is formulated by invoking the LHV theory explicitly. To that end, the classical mutual information $I(a, b)$ of the spins is assumed to correspond to the joint probability $p(\epsilon_A(a), \epsilon_B(b))$ ($\epsilon_A(a), \epsilon_B(b) = \pm 1$), constructed by invoking the LHV theory, for the spin A to have the component $\epsilon_A(a)/2$ in the direction $a$ and the spin B to have the component $\epsilon_B(b)/2$ in the direction $b$. The quantumness of mutual information is then defined as $Q_{\rm LHV} = I_Q(\hat{\rho}_{AB}) - I_{\rm LHV}$, where $I_Q(\hat{\rho}_{AB})$ is the quantum theoretic information content in the state $\hat{\rho}_{AB}$ and the LHV theoretic classical information $I_{\rm LHV}$ is defined in terms of $I(a, b)$ by choosing the directions $a$, $b$ as follows. The choice of the directions $a$, $b$ is made by finding the Bloch vectors $\langle\hat{S}_A\rangle$ and $\langle\hat{S}_B\rangle$ of the spins A and B, where $\hat{S}_A$ ($\hat{S}_B$) is the spin vector of spin A (spin B) and $\langle\hat{P}\rangle = {\rm Tr}(\hat{P}\hat{\rho}_{AB})$. The directions $a$ and $b$ are taken to be along the Bloch vectors of A and B respectively if those Bloch vectors are non-zero. In that case $I_{\rm LHV} = I(a, b)$, and $Q_{\rm LHV}$ turns out to be identical with the measurement-induced disturbance.
If $\langle\hat{S}_A\rangle = \langle\hat{S}_B\rangle = 0$, then $I_{\rm LHV}$ is defined to be the maximum of $I(a, b)$ over $a$ and $b$. This optimization can be performed exactly analytically, and $Q_{\rm LHV}$ is then found to be the same as the symmetric discord. If $\langle\hat{S}_A\rangle = 0$ and $\langle\hat{S}_B\rangle \neq 0$, then $I_{\rm LHV}$ is defined to be the maximum of $I(a, b)$ over $a$ with $b = \langle\hat{S}_B\rangle/|\langle\hat{S}_B\rangle|$. The $Q_{\rm LHV}$ is then the same as the quantum discord for measurement on A if the eigenstates of $\hat{S}_B \cdot b$ are also the eigenstates of the operator $\langle\pm, a_m|\hat{\rho}_{AB}|\pm, a_m\rangle$ on B, where $a_m$ is the direction of optimization of spin A for evaluation of the quantum discord and $|\pm, a_m\rangle$ are the eigenstates of $\hat{S}_A \cdot a_m$.
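The two ingredients of the construction above can be illustrated numerically. The following sketch (not taken from the paper) computes the Bloch vectors $\langle\hat{S}_A\rangle$, $\langle\hat{S}_B\rangle$ and the classical mutual information $I(a, b)$ for a two-qubit density matrix, assuming that the LHV-consistent joint probabilities coincide with the standard projective-measurement probabilities $p(\epsilon_A, \epsilon_B) = {\rm Tr}(\hat{\rho}_{AB}\,\Pi_A(\epsilon_A, a)\otimes\Pi_B(\epsilon_B, b))$, which holds for commuting local projective measurements. All function names are illustrative.

```python
import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
paulis = [sx, sy, sz]

def bloch_vector(rho, subsystem):
    """Bloch vector of one qubit (0 = A, 1 = B) of a two-qubit state rho."""
    vec = []
    for s in paulis:
        op = np.kron(s, np.eye(2)) if subsystem == 0 else np.kron(np.eye(2), s)
        vec.append(np.real(np.trace(rho @ op)))
    return np.array(vec)

def projector(direction, eps):
    """Projector onto the eps = +/-1 eigenstate of sigma . direction."""
    n = direction / np.linalg.norm(direction)
    sigma_n = n[0] * sx + n[1] * sy + n[2] * sz
    return 0.5 * (np.eye(2) + eps * sigma_n)

def mutual_information(rho, a, b):
    """Classical mutual information I(a, b) of spin measurements along a and b."""
    p = np.zeros((2, 2))
    for i, ea in enumerate((+1, -1)):
        for j, eb in enumerate((+1, -1)):
            P = np.kron(projector(a, ea), projector(b, eb))
            p[i, j] = np.real(np.trace(rho @ P))
    pa, pb = p.sum(axis=1), p.sum(axis=0)  # marginals
    I = 0.0
    for i in range(2):
        for j in range(2):
            if p[i, j] > 1e-12:
                I += p[i, j] * np.log2(p[i, j] / (pa[i] * pb[j]))
    return I

# Example: a Werner state, x |singlet><singlet| + (1 - x) I/4,
# which has vanishing Bloch vectors on both subsystems.
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
x = 0.8
rho = x * np.outer(singlet, singlet.conj()) + (1 - x) * np.eye(4) / 4
z = np.array([0.0, 0.0, 1.0])
print(bloch_vector(rho, 0))           # ~ [0, 0, 0]
print(mutual_information(rho, z, z))
```

Since both Bloch vectors vanish here, this state falls in the case where $I_{\rm LHV}$ is obtained by maximizing $I(a, b)$ over both directions; for the rotationally invariant Werner state any pair of parallel directions attains the maximum.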
Pages: 10