Information-Theoretic Metric Learning: 2-D Linear Projections of Neural Data for Visualization

Cited by: 0
Authors
Brockmeier, Austin J. [1 ]
Giraldo, Luis G. Sanchez [1 ]
Emigh, Matthew S. [1 ]
Bae, Jihye [1 ]
Choi, John S. [2 ]
Francis, Joseph T. [2 ]
Principe, Jose C. [1 ]
Affiliations
[1] Univ Florida, Dept Elect & Comp Engn, Gainesville, FL 32611 USA
[2] State Univ New York Downstate Med Sch, Dept Physiol & Pharmacol, Brooklyn, NY 11203 USA
Keywords
DIMENSIONALITY REDUCTION
DOI
Not available
CLC number
R318 [Biomedical Engineering]
Subject classification code
0831
Abstract
Intracortical neural recordings are typically high-dimensional due to many electrodes, channels, or units and high sampling rates, making it very difficult to visually inspect differences among responses to various conditions. By representing the neural response in a low-dimensional space, a researcher can visually evaluate the amount of information the response carries about the conditions. We consider a linear projection to 2-D space that also parametrizes a metric between neural responses. The projection, and corresponding metric, should preserve class-relevant information pertaining to different behavior or stimuli. We find the projection as a solution to the information-theoretic optimization problem of maximizing the information between the projected data and the class labels. The method is applied to two datasets using different types of neural responses: motor cortex neuronal firing rates of a macaque during a center-out reaching task, and local field potentials in the somatosensory cortex of a rat during tactile stimulation of the forepaw. In both cases, projected data points preserve the natural topology of targets or peripheral touch sites. Using the learned metric on the neural responses increases the nearest-neighbor classification rate versus the original data; thus, the metric is tuned to distinguish among the conditions.
Pages: 5586-5589
Page count: 4
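The abstract outlines the approach at a high level: learn a d x 2 linear projection A, which also parametrizes the metric ||A^T(x - x')|| between responses, by maximizing a dependence measure between the projected data and the condition labels, then verify that nearest-neighbor classification improves under the learned metric. The sketch below is a minimal, hypothetical Python illustration of that idea, not the authors' implementation: it substitutes an HSIC-style kernel dependence for the paper's information-theoretic objective, uses synthetic data in place of neural recordings, and every name and parameter choice in it is an assumption.

```python
# Minimal sketch (NOT the authors' code): learn a d x 2 linear projection A by
# maximizing an HSIC-style kernel dependence between projected points and class
# labels, as a stand-in for the paper's information-theoretic objective, then
# compare leave-one-out 1-NN accuracy in the original and projected spaces.
# All data below are synthetic placeholders for neural responses.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Synthetic "neural responses": 3 classes embedded in 20-D noise.
n_per_class, d = 50, 20
centers = rng.normal(size=(3, d)) * 2.0
X = np.vstack([c + rng.normal(size=(n_per_class, d)) for c in centers])
y = np.repeat(np.arange(3), n_per_class)
n = len(y)

def neg_hsic(a_flat):
    """Negative HSIC between the projected data Z = X A and the class labels."""
    A = a_flat.reshape(d, 2)
    Z = X @ A
    # Pairwise squared distances in the projected (metric) space.
    sq = np.sum(Z**2, axis=1)
    D2 = sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T
    # Median-distance bandwidth makes the objective insensitive to the scale of A.
    sigma2 = np.median(D2[D2 > 0]) + 1e-12
    K = np.exp(-D2 / sigma2)                       # Gaussian kernel on Z
    L = (y[:, None] == y[None, :]).astype(float)   # delta kernel on labels
    H = np.eye(n) - np.ones((n, n)) / n            # centering matrix
    hsic = np.trace(K @ H @ L @ H) / (n - 1) ** 2
    return -hsic

A0 = rng.normal(size=(d, 2)).ravel() * 0.1
res = minimize(neg_hsic, A0, method="L-BFGS-B")    # numerical gradients
A = res.x.reshape(d, 2)

def loo_1nn_accuracy(V, labels):
    """Leave-one-out 1-nearest-neighbor accuracy under Euclidean distance in V."""
    sq = np.sum(V**2, axis=1)
    D2 = sq[:, None] + sq[None, :] - 2.0 * V @ V.T
    np.fill_diagonal(D2, np.inf)                   # exclude each point itself
    return np.mean(labels[np.argmin(D2, axis=1)] == labels)

print("1-NN accuracy, original space :", loo_1nn_accuracy(X, y))
print("1-NN accuracy, learned metric :", loo_1nn_accuracy(X @ A, y))
```

Plotting the rows of `X @ A` as a 2-D scatter colored by label gives the kind of low-dimensional visualization the abstract describes; under the stated assumptions, the 1-NN accuracy in the projected space should match or exceed that in the original space when the classes are in fact separable along a low-dimensional direction.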