Deep Ensemble Transformers for Dimensionality Reduction

Cited by: 0
Authors
Nareklishvili, Maria [1 ,2 ]
Geitle, Marius [3 ]
Affiliations
[1] Univ Oslo, Dept Econ, N-0313 Oslo, Norway
[2] Univ Chicago, Booth Sch Business, Chicago, IL 60637 USA
[3] Ostfold Univ Coll, Dept Comp Sci & Commun, N-1757 Halden, Norway
Keywords
Consistent random forests; consistent representation learning; dimensionality reduction; gene expression data; neural networks
DOI
10.1109/TNNLS.2024.3357621
CLC number (Chinese Library Classification)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
We propose deep ensemble transformers (DETs), a fast, scalable approach for dimensionality reduction problems. The method leverages deep neural networks and uses cascade ensemble techniques as its core feature-extraction tool. To handle high-dimensional data, the approach applies a flexible number of intermediate layers sequentially; these layers progressively transform the input data into decision-tree predictions. To further improve predictive performance, the output of the final intermediate layer is passed through a feed-forward neural network for the final prediction. We derive an upper bound on the gap between the generalization error and the empirical error and show that it converges to zero, which highlights the applicability of our method to parameter estimation and feature selection problems. In our experimental evaluations, DETs outperform existing models in prediction accuracy, representation-learning ability, and computational time. Specifically, the method achieves over 95% accuracy on gene expression data and trains, on average, 50% faster than traditional artificial neural networks (ANNs).
Pages: 1-12
Number of pages: 12
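
As a rough illustration of the pipeline described in the abstract, the Python sketch below stacks random-forest layers whose predictions feed the representation passed to the next layer, with a small feed-forward network as the final predictor. This is a minimal sketch under stated assumptions (layer count, tree counts, feature augmentation versus replacement, and in-sample training are all illustrative choices), not the authors' DET implementation.

# A minimal, hypothetical sketch of a cascade-ensemble-to-feed-forward pipeline
# in the spirit of the abstract. NOT the authors' DET implementation: layer count,
# tree counts, the feature-augmentation step, and in-sample training are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor


class CascadeEnsembleSketch:
    """Stack random-forest layers, then fit a feed-forward network on the final representation."""

    def __init__(self, n_layers=3, n_trees=100, hidden=(64, 32), random_state=0):
        self.layers = [
            RandomForestRegressor(n_estimators=n_trees, random_state=random_state + i)
            for i in range(n_layers)
        ]
        self.head = MLPRegressor(hidden_layer_sizes=hidden, max_iter=2000,
                                 random_state=random_state)

    def fit(self, X, y):
        Z = X
        for forest in self.layers:
            forest.fit(Z, y)
            # Augment the current representation with this layer's tree predictions
            # (the paper may replace rather than augment; augmentation is an assumption).
            Z = np.column_stack([Z, forest.predict(Z)])
        self.head.fit(Z, y)  # feed-forward network produces the final prediction
        return self

    def predict(self, X):
        Z = X
        for forest in self.layers:
            Z = np.column_stack([Z, forest.predict(Z)])
        return self.head.predict(Z)


# Illustrative usage on synthetic data (shapes and noise level are arbitrary).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 50))
    y = X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.standard_normal(200)
    model = CascadeEnsembleSketch().fit(X, y)
    print(model.predict(X[:5]))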