Relations frequency hypermatrices in mutual, conditional and joint entropy-based information indices

Cited by: 30
Authors
Barigye, Stephen J. [1 ]
Marrero-Ponce, Yovani [1 ,2 ,3 ]
Martinez-Lopez, Yoan [1 ,4 ]
Torrens, Francisco [2 ]
Manuel Artiles-Martinez, Luis [1 ]
Pino-Urias, Ricardo W. [1 ]
Martinez-Santiago, Oscar [1 ,5 ]
Affiliations
[1] Univ Cent Martha Abreu Las Villas, Fac Chem Pharm, CAMD BIR Unit, Unit Comp Aided Mol Biosil Discovery & Bioinforma, Santa Clara 54830, Villa Clara, Cuba
[2] Univ Valencia, Inst Univ Ciencia Mol, E-46071 Valencia, Spain
[3] Univ Valencia, Fac Farm, Dept Quim Fis, Unidad Invest Diseno Farmacos & Conectividad Mol, E-46071 Valencia, Spain
[4] Camaguey Univ, Fac Informat, Dept Comp Sci, Camaguey City 74650, Camaguey, Cuba
[5] Univ Cent Martha Abreu Las Villas, Fac Chem Pharm, Dept Chem Sci, Santa Clara 54830, Villa Clara, Cuba
Keywords
relations frequency matrix; hypermatrix; information index; variability analysis; physico-chemical property; 2-furylethylene derivative; QSPR study; MOLECULAR DESCRIPTORS; CONNECTIVITY INDEXES; QSPR/QSAR;
DOI
10.1002/jcc.23123
CLC number
O6 [Chemistry];
Subject classification code
0703;
Abstract
Graph-theoretic matrix representations constitute the most popular and significant source of topological molecular descriptors (MDs). Recently, we introduced a novel matrix representation, named the duplex relations frequency matrix, F, derived from the generalization of an incidence matrix whose row entries are connected subgraphs of a given molecular graph G. Using this matrix, a series of information indices (IFIs) was proposed. In this report, an extension of F is presented, introducing for the first time the concept of a hypermatrix in graph-theoretic chemistry. The hypermatrix representation explores the n-tuple participation frequencies of vertices in a set of connected subgraphs of G. In this study, however, we focus on triple and quadruple participation frequencies, generating triple and quadruple relations frequency matrices, respectively. The introduction of hypermatrices allows us to redefine the recently proposed MDs, that is, the mutual, conditional, and joint entropy-based IFIs, in a generalized way. These IFIs are implemented in GT-STAF (acronym for Graph Theoretical Thermodynamic STAte Functions), a new module of the TOMOCOMD-CARDD program. Information-theoretic variability analysis of the proposed IFIs suggests that the use of hypermatrices enhances the entropy and, hence, the variability of the previously proposed IFIs, especially the conditional and mutual entropy-based IFIs. The predictive capacity of the proposed IFIs was evaluated by analyzing regression models obtained for two physico-chemical properties, the partition coefficient (Log P) and the specific rate constant (Log K), of 34 derivatives of 2-furylethylene. The statistical parameters for the best models obtained for these properties were compared with those reported in the literature, showing better performance.
This result suggests that the hypermatrix-based redefinition of the previously proposed IFIs provides yet other valuable tools beneficial to QSPR studies and diversity analysis. (C) 2012 Wiley Periodicals, Inc.
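The entropy-based indices named in the abstract (joint, conditional, and mutual entropy) can be illustrated with a small sketch. This is not the authors' GT-STAF implementation; it only shows, under the assumption that a pairwise relations frequency matrix `F[i][j]` counts how often vertices i and j co-occur in connected subgraphs, how such a matrix yields the standard Shannon quantities. The function name `entropy_indices` is hypothetical.

```python
# Illustrative sketch (not the authors' GT-STAF code): Shannon entropy
# indices from a hypothetical pairwise relations frequency matrix F,
# where F[i][j] counts co-occurrences of vertices i and j in subgraphs.
from math import log2

def entropy_indices(F):
    total = sum(sum(row) for row in F)
    # joint probabilities p(i, j) from co-occurrence frequencies
    p = [[f / total for f in row] for row in F]
    # marginal distributions over row and column vertices
    px = [sum(row) for row in p]
    py = [sum(col) for col in zip(*p)]
    h = lambda probs: -sum(q * log2(q) for q in probs if q > 0)
    H_xy = h(q for row in p for q in row)  # joint entropy H(X,Y)
    H_x, H_y = h(px), h(py)
    return {
        "joint": H_xy,
        "conditional": H_xy - H_x,         # H(Y|X) = H(X,Y) - H(X)
        "mutual": H_x + H_y - H_xy,        # I(X;Y)
    }
```

For a toy matrix such as `[[2, 1], [1, 2]]`, the identities H(Y|X) = H(X,Y) - H(X) and I(X;Y) = H(X) + H(Y) - H(X,Y) hold by construction; the higher-order (triple, quadruple) hypermatrix indices described in the paper generalize these quantities to n-tuple frequencies.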
Pages: 259-274 (16 pages)