Attention and self-attention in random forests

Citations: 0
Authors
Lev V. Utkin
Andrei V. Konstantinov
Stanislav R. Kirpichenko
Institutions
[1] Peter the Great St. Petersburg Polytechnic University, Higher School of Artificial Intelligence
Source
Progress in Artificial Intelligence
Keywords
Attention mechanism; Random forest; Nadaraya–Watson regression; Quadratic programming; Linear programming; Contamination model
DOI
Not available
Abstract
New models of random forests that jointly use attention and self-attention mechanisms are proposed for solving the regression problem. The models can be regarded as extensions of the attention-based random forest, whose idea stems from applying a combination of Nadaraya–Watson kernel regression and Huber's contamination model to random forests. The self-attention aims to capture dependencies among the tree predictions and to remove noisy or anomalous predictions in the random forest. The self-attention module is trained jointly with the attention module that computes the weights. It is shown that training the attention weights reduces to solving a single quadratic or linear optimization problem. Three modifications of the self-attention are proposed and compared. A specific multi-head self-attention for the random forest is also considered, in which the heads are obtained by varying the tuning parameters, including the kernel parameters and the contamination parameter. The proposed combinations of attention and self-attention are verified and compared with other random forest models on several datasets. The code implementing the corresponding algorithms is publicly available.
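To make the abstract's core construction concrete, the following is a minimal Python sketch (not the authors' published code) of the attention-based random forest idea it describes: each tree's prediction is weighted by a Nadaraya–Watson attention score computed from the mean training vector of the leaf that the input falls into, and the kernel attention is mixed with trainable per-tree weights through Huber's epsilon-contamination model. The Gaussian kernel, the class and parameter names (AttentionForestSketch, tau, epsilon), and the uniform placeholder for the trainable weights v (which the paper instead obtains by quadratic or linear programming) are all illustrative assumptions.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

def softmax(z):
    # Numerically stable softmax over attention scores.
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

class AttentionForestSketch:
    """Illustrative sketch of an attention-based random forest;
    names and defaults are assumptions, not the paper's implementation."""

    def __init__(self, n_trees=100, tau=1.0, epsilon=0.1):
        self.forest = RandomForestRegressor(n_estimators=n_trees, random_state=0)
        self.tau = tau          # kernel temperature (Gaussian kernel assumed)
        self.epsilon = epsilon  # contamination parameter in [0, 1]

    def fit(self, X, y):
        X = np.asarray(X, dtype=float)
        self.forest.fit(X, y)
        # For each tree, store the mean training vector of every leaf;
        # this plays the role of A_k(x) in the Nadaraya-Watson attention.
        self.leaf_means = []
        for tree in self.forest.estimators_:
            leaves = tree.apply(X)
            self.leaf_means.append(
                {leaf: X[leaves == leaf].mean(axis=0) for leaf in np.unique(leaves)}
            )
        # Trainable contamination weights v (one per tree). The paper trains
        # them by solving a quadratic or linear program over the probability
        # simplex; a uniform vector is used here purely as a placeholder.
        n = len(self.forest.estimators_)
        self.v = np.full(n, 1.0 / n)
        return self

    def predict_one(self, x):
        x = np.asarray(x, dtype=float).reshape(1, -1)
        scores, preds = [], []
        for tree, means in zip(self.forest.estimators_, self.leaf_means):
            leaf = tree.apply(x)[0]
            a_k = means[leaf]  # mean of training vectors in the leaf of x
            scores.append(-np.sum((x[0] - a_k) ** 2) / self.tau)
            preds.append(tree.predict(x)[0])
        # Huber's contamination model: mix the Nadaraya-Watson kernel
        # attention with the trainable weights v.
        alpha = (1 - self.epsilon) * softmax(np.array(scores)) + self.epsilon * self.v
        return float(alpha @ np.array(preds))

In this sketch, epsilon interpolates between pure kernel attention (epsilon = 0) and a forest weighted only by v (epsilon = 1); the multi-head variant described in the abstract would correspond to combining several such models with different tau and epsilon values.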
Pages: 257-273
Number of pages: 16
Related papers
50 items in total
  • [1] Attention and self-attention in random forests
    Utkin, Lev V.
    Konstantinov, Andrei V.
    Kirpichenko, Stanislav R.
    PROGRESS IN ARTIFICIAL INTELLIGENCE, 2023, 12(3): 257-273
  • [2] Shyness and self-attention
    Crozier, W. R.
    BULLETIN OF THE BRITISH PSYCHOLOGICAL SOCIETY, 1983, 36(FEB): A5
  • [3] Focus of attention in groups - A self-attention perspective
    Mullen, B.
    Chapman, J. G.
    Peaugh, S.
    JOURNAL OF SOCIAL PSYCHOLOGY, 1989, 129(6): 807-817
  • [4] Self-attention random forest for breast cancer image classification
    Li, Jia
    Shi, Jingwen
    Chen, Jianrong
    Du, Ziqi
    Huang, Li
    FRONTIERS IN ONCOLOGY, 2023, 13
  • [5] Self-Attention for Cyberbullying Detection
    Pradhan, Ankit
    Yatam, Venu Madhav
    Bera, Padmalochan
    2020 INTERNATIONAL CONFERENCE ON CYBER SITUATIONAL AWARENESS, DATA ANALYTICS AND ASSESSMENT (CYBER SA 2020), 2020
  • [6] On the Integration of Self-Attention and Convolution
    Pan, Xuran
    Ge, Chunjiang
    Lu, Rui
    Song, Shiji
    Chen, Guanfu
    Huang, Zeyi
    Huang, Gao
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2022), 2022: 805-815
  • [7] The Lipschitz Constant of Self-Attention
    Kim, Hyunjik
    Papamakarios, George
    Mnih, Andriy
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021
  • [8] The function of the self-attention network
    Cunningham, Sheila J.
    COGNITIVE NEUROSCIENCE, 2016, 7(1-4): 21-22
  • [9] On The Computational Complexity of Self-Attention
    Keles, Feyza Duman
    Wijewardena, Pruthuvi Mahesakya
    Hegde, Chinmay
    INTERNATIONAL CONFERENCE ON ALGORITHMIC LEARNING THEORY, VOL 201, 2023: 597-619
  • [10] Self-Attention Graph Pooling
    Lee, Junhyun
    Lee, Inyeop
    Kang, Jaewoo
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019