Shannon information entropy in the canonical genetic code

Cited by: 28
Authors
Nemzer, Louis R. [1 ]
Affiliations
[1] Nova Southeastern Univ, Chem & Phys, 3301 Coll Ave, Davie, FL 33314 USA
Keywords
Genetic code; RNA translation; Information theory; Shannon entropy; Amino acids; ERROR MINIMIZATION; TRANSFER-RNAS; ORIGIN; EVOLUTION; MODEL; SUBSTITUTION; COEVOLUTION; OPTIMALITY;
DOI
10.1016/j.jtbi.2016.12.010
Chinese Library Classification
Q [Biological Sciences];
Subject Classification Codes
07 ; 0710 ; 09 ;
Abstract
The Shannon entropy measures the expected information value of messages. As with thermodynamic entropy, the Shannon entropy is only defined within a system that identifies at the outset the collections of possible messages, analogous to microstates, that will be considered indistinguishable macrostates. This fundamental insight is applied here for the first time to amino acid alphabets, which group the twenty common amino acids into families based on chemical and physical similarities. To evaluate these schemas objectively, a novel quantitative method is introduced based on the inherent redundancy in the canonical genetic code. Each alphabet is taken as a separate system that partitions the 64 possible RNA codons, the microstates, into families, the macrostates. By calculating the normalized mutual information, which measures the reduction in Shannon entropy conveyed by single nucleotide messages, groupings that best leverage this aspect of fault tolerance in the code are identified. The relative importance of properties related to protein folding, like hydropathy and size, and to function, including side-chain acidity, can also be estimated. This approach allows the quantification of the average information value of nucleotide positions, which can shed light on the coevolution of the canonical genetic code with the tRNA-protein translation mechanism.
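The calculation the abstract describes can be illustrated with a minimal sketch (not the paper's exact pipeline, which uses multiple amino acid alphabets and normalization): treating the 64 RNA codons as equiprobable microstates and the standard amino acid assignments as macrostates, one can compute the Shannon entropy of the macrostate partition and the mutual information between a single nucleotide position and the translated amino acid. The codon-ordering string below is NCBI translation table 1; the variable names are illustrative, not from the paper.

```python
# Hedged sketch: Shannon entropy of the amino-acid "macrostates" induced by
# the 64 RNA codons (assuming uniform codon usage), plus the mutual
# information I(AA; N1) between the first nucleotide and the amino acid.
from collections import Counter
from math import log2

BASES = "UCAG"  # standard-table enumeration order
# NCBI translation table 1: codons listed U/C/A/G by position; '*' = stop
AA = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"

codons = [a + b + c for a in BASES for b in BASES for c in BASES]
code = dict(zip(codons, AA))  # microstate (codon) -> macrostate (amino acid)

def entropy(symbols):
    """Shannon entropy in bits of an empirical symbol distribution."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# H(AA): entropy of the amino-acid partition under uniform codon usage
h_aa = entropy(AA)

# H(AA | N1): average entropy once the first nucleotide is known,
# so I(AA; N1) = H(AA) - H(AA | N1) is the information that one
# first-position nucleotide "message" conveys about the amino acid.
h_cond = sum(
    entropy([code[c] for c in codons if c[0] == b]) for b in BASES
) / len(BASES)
mi_first = h_aa - h_cond
```

Under uniform codon usage the macrostate entropy comes out near 4.2 bits (well below the 6 bits of the codon microstates, reflecting the code's redundancy), and the first nucleotide alone conveys a substantial fraction of that information.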
Pages: 158-170
Page count: 13
Related Papers
50 records
  • [1] Shannon Information Entropy as Complexity Metric of Source Code
    Cholewa, Marcin
    [J]. PROCEEDINGS OF THE 24TH INTERNATIONAL CONFERENCE MIXED DESIGN OF INTEGRATED CIRCUITS AND SYSTEMS - MIXDES 2017, 2017, : 468 - 471
  • [2] Processing Combat Information with Shannon Entropy and Improved Genetic Algorithm
    Guo, Songyun
    Yang, Yanping
    Yan, Mingming
    [J]. 2015 8TH INTERNATIONAL CONFERENCE ON BIOMEDICAL ENGINEERING AND INFORMATICS (BMEI), 2015, : 857 - 861
  • [3] Shannon Entropy Analysis of the Genome Code
    Tenreiro Machado, J. A.
    [J]. MATHEMATICAL PROBLEMS IN ENGINEERING, 2012, 2012
  • [4] SHANNON-INFORMATION IS NOT ENTROPY
    SCHIFFER, M
    [J]. PHYSICS LETTERS A, 1991, 154 (7-8) : 361 - 365
  • [5] Physical information entropy and probability Shannon entropy
    R. Ascoli
    R. Urigu
    [J]. International Journal of Theoretical Physics, 1997, 36 : 1691 - 1716
  • [7] The Shannon Information entropy of protein sequences
    Strait, BJ
    Dewey, TG
    [J]. BIOPHYSICAL JOURNAL, 1996, 71 (01) : 148 - 155
  • [8] An entropy measure for finite strings based on the Shannon entropy of a code set
    Speidel, U
    [J]. 2003 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY - PROCEEDINGS, 2003, : 25 - 25
  • [9] Dynamical Shannon entropy and information Tsallis entropy in complex systems
    Yulmetyev, RM
    Emelyanova, NA
    Gafarov, FM
    [J]. PHYSICA A-STATISTICAL MECHANICS AND ITS APPLICATIONS, 2004, 341 : 649 - 676
  • [10] Shannon's Measure of Information and the Thermodynamic Entropy
    Ben-Naim, Arieh
    [J]. BAYESIAN INFERENCE AND MAXIMUM ENTROPY METHODS IN SCIENCE AND ENGINEERING, 2012, 1443 : 129 - 142