Evaluation of Statistical Relationship of Random Variables via Mutual Information

Cited by: 0
Authors
Tsurko, V. V. [1 ]
Mikhalskii, A., I [1 ]
Affiliations
[1] Russian Acad Sci, Trapeznikov Inst Control Sci, Moscow 117997, Russia
Keywords
correlation coefficient; nonparametric evaluation of mutual information; reproducing-kernel Hilbert space; pentapeptide stability prediction
DOI
10.1134/S000511792205006X
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
We consider the use of nonparametric evaluation of mutual information to determine the relationship between random variables. It is shown that in the presence of a nonlinear relationship between random variables, the correlation coefficient can give a misleading result. A method is proposed for constructing an evaluation of mutual information from empirical data in an abstract reproducing-kernel Hilbert space. Using the generalized representer theorem, a method for nonparametric evaluation of mutual information is proposed. The performance of the method is demonstrated on examples of artificial data analysis, and its application to predicting the stability of pentapeptides is described.
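The abstract's first claim — that the correlation coefficient can miss a nonlinear dependence that mutual information detects — can be illustrated with a minimal sketch. The snippet below uses a simple histogram-based mutual information estimate for the illustration, not the paper's RKHS-based estimator; the variable names and bin count are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 10_000)
y = x**2 + 0.05 * rng.normal(size=10_000)  # nonlinear (quadratic) dependence

# Pearson correlation is near zero: Cov(x, x^2) = E[x^3] = 0 for symmetric x.
r = np.corrcoef(x, y)[0, 1]

def mi_hist(x, y, bins=20):
    """Plug-in mutual information estimate (in nats) from a 2D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()                      # joint probability table
    px = pxy.sum(axis=1, keepdims=True)        # marginal of x
    py = pxy.sum(axis=0, keepdims=True)        # marginal of y
    nz = pxy > 0                               # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

mi = mi_hist(x, y)
print(f"correlation = {r:.3f}, mutual information = {mi:.3f} nats")
```

The correlation coefficient comes out close to zero despite the near-deterministic relationship, while the mutual information estimate is clearly positive — the motivating observation for the paper's method.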
Pages: 734-742
Number of pages: 9
Related Papers (50 total)
  • [1] Evaluation of Statistical Relationship of Random Variables via Mutual Information
    V. V. Tsurko
    A. I. Mikhalskii
    [J]. Automation and Remote Control, 2022, 83 : 734 - 742
  • [2] Mutual information of several random variables and its estimation via variation
    Prelov, V. V.
    [J]. PROBLEMS OF INFORMATION TRANSMISSION, 2009, 45 (04) : 295 - 308
  • [4] On the Mutual Information between Random Variables in Networks
    Xu, Xiaoli
    Thakor, Satyajit
    Guan, Yong Liang
    [J]. 2013 IEEE INFORMATION THEORY WORKSHOP (ITW), 2013,
  • [5] On mutual information estimation for mixed-pair random variables
    Beknazaryan, Aleksandr
    Dang, Xin
    Sang, Hailin
    [J]. STATISTICS & PROBABILITY LETTERS, 2019, 148 : 9 - 16
  • [6] A Bayesian Alternative to Mutual Information for the Hierarchical Clustering of Dependent Random Variables
    Marrelec, Guillaume
    Messe, Arnaud
    Bellec, Pierre
    [J]. PLOS ONE, 2015, 10 (09):
  • [7] The derivation of Mutual Information and covariance function using centered random variables.
    Razak, Fatimah Abdul
    [J]. INTERNATIONAL CONFERENCE ON QUANTITATIVE SCIENCES AND ITS APPLICATIONS (ICOQSIA 2014), 2014, 1635 : 883 - 889
  • [8] STATISTICAL EVALUATION OF THE DISTRIBUTION OF SUMS OF INDEPENDENT RANDOM-VARIABLES
    ILYUSHIN, VB
    SOLODYANNIKOV, YV
    [J]. IZVESTIYA VYSSHIKH UCHEBNYKH ZAVEDENII MATEMATIKA, 1983, (09): : 27 - 41
  • [9] Bounds on the rates of statistical divergences and mutual information via stochastic thermodynamics
    Karbowski, Jan
    [J]. PHYSICAL REVIEW E, 2024, 109 (05)
  • [10] Identifying Statistical Dependence in Genomic Sequences via Mutual Information Estimates
    Aktulga, Hasan Metin
    Kontoyiannis, Ioannis
    Lyznik, L. Alex
    Szpankowski, Lukasz
    Grama, Ananth Y.
    Szpankowski, Wojciech
    [J]. EURASIP JOURNAL ON BIOINFORMATICS AND SYSTEMS BIOLOGY, 2007, (01)