Assessing Conceptual Complexity and Compressibility Using Information Gain and Mutual Information

Cited: 6
Author
Mathy, Fabien [1]
Affiliation
[1] Univ Franche Comte, Besancon, France
DOI
10.20982/tqmp.06.1.p016
Abstract
In this paper, a few basic notions stemming from information theory are presented with the intention of modeling the abstraction of relevant information in categorization tasks. In a categorization task, a single output variable is the basis for performing a dichotomic classification of objects that can be distinguished by a set of input variables, which are more or less informative about the category to which the objects belong. At the beginning of the experiment, the target classification is unknown to learners, who must select the most informative variables relative to the class in order to classify the objects efficiently. I first show how the notion of entropy can be used to characterize basic psychological processes in learning. Then, I indicate how a learner might use information gain and mutual information (both based on entropy) to efficiently induce the shortest rule for categorizing a set of objects. Several basic classification tasks are studied in succession with the aim of showing that learning can improve as long as subjects are able to compress information. Referring to recent experimental results, I indicate in the Conclusion that these notions can account for both strategies and performance in subjects trying to simplify a learning process.
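The quantities named in the abstract can be made concrete with a small sketch (not taken from the paper): Shannon entropy of the category label and the information gain of each stimulus feature in a toy dichotomic classification task. The feature names and stimuli below are hypothetical illustrations.

```python
# Illustrative sketch: entropy and information gain for a toy
# dichotomic categorization task. Stimuli and features are hypothetical.
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy H(Y), in bits, of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(examples, labels, feature):
    """IG(Y; X) = H(Y) - H(Y | X): the reduction in class uncertainty
    obtained by observing the given feature."""
    n = len(labels)
    cond = 0.0  # conditional entropy H(Y | X)
    for value in set(ex[feature] for ex in examples):
        subset = [y for ex, y in zip(examples, labels) if ex[feature] == value]
        cond += (len(subset) / n) * entropy(subset)
    return entropy(labels) - cond

# Toy stimuli: shape perfectly predicts the category; color is uninformative.
stimuli = [
    {"shape": "square", "color": "red"},
    {"shape": "square", "color": "blue"},
    {"shape": "circle", "color": "red"},
    {"shape": "circle", "color": "blue"},
]
classes = ["A", "A", "B", "B"]

print(information_gain(stimuli, classes, "shape"))  # 1.0 bit
print(information_gain(stimuli, classes, "color"))  # 0.0 bits
```

For discrete variables, the information gain of a feature with respect to the class equals their mutual information, which is why a learner ranking features by either quantity would keep "shape" and discard "color" here, compressing the rule to a single relevant dimension.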
Pages: 16-30 (15 pages)