Attentional Masking for Pre-trained Deep Networks

Cited: 0
Authors: Wallenberg, Marcus [1]; Forssen, Per-Erik [1]
Affiliations: [1] Linkoping Univ, Dept Elect Engn, Linkoping, Sweden
Funding: Swedish Research Council
Keywords: (none listed)
DOI: not available
CLC classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract:
The ability to direct visual attention is a fundamental skill for seeing robots. Attention comes in two flavours: gaze direction (overt attention) and attention to a specific part of the current field of view (covert attention); the latter is the focus of the present study. Specifically, we study the effects of attentional masking within pre-trained deep neural networks for the purpose of handling ambiguous scenes containing multiple objects. We investigate several variants of attentional masking on partially pre-trained deep neural networks and evaluate their effects on classification performance and sensitivity to attention mask errors in multi-object scenes. We find that a combined scheme of multi-level masking and blending provides the best trade-off between classification accuracy and insensitivity to masking errors. We denote this approach multilayer continuous-valued convolutional feature masking (MC-CFM). Given reasonably accurate masks, it suppresses the influence of distracting objects and reaches classification performance comparable to unmasked recognition in scenes without distractors.
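The abstract gives only a high-level description of MC-CFM, so the PyTorch sketch below is one plausible reading of the idea rather than the authors' method: a continuous-valued attention mask is resized to each feature map's resolution and blended with the unmasked activations at several stages of a pre-trained network. The backbone (VGG-16), the hooked layer indices, and the blending rule with weight alpha are illustrative assumptions.

```python
# Illustrative sketch of multi-level continuous-valued feature masking with
# blending, inspired by (but not reproducing) the MC-CFM scheme.
import torch
import torch.nn.functional as F
from torchvision.models import vgg16

def mask_features(features: torch.Tensor, mask: torch.Tensor,
                  alpha: float = 0.5) -> torch.Tensor:
    """Blend masked and unmasked feature maps.

    features: (B, C, H, W) activations from one network stage
    mask:     (B, 1, h, w) continuous-valued attention mask in [0, 1]
    alpha:    blending weight (assumed); alpha=1 is pure masking,
              alpha=0 leaves the features untouched
    """
    # Resize the mask to this stage's spatial resolution.
    m = F.interpolate(mask, size=features.shape[-2:],
                      mode="bilinear", align_corners=False)
    # Continuous-valued masking with blending: attenuate rather than
    # zero out, so mask errors are less destructive.
    return features * (alpha * m + (1.0 - alpha))

# Attach the masking at several stages of a pre-trained network via
# forward hooks; the chosen indices are illustrative, not from the paper.
model = vgg16(weights="IMAGENET1K_V1").eval()
attention_mask = torch.rand(1, 1, 224, 224)  # stand-in for a real mask

def make_hook(mask: torch.Tensor):
    def hook(module, inputs, output):
        return mask_features(output, mask)  # returned tensor replaces output
    return hook

for idx in (4, 9, 16):  # assumed pooling stages of VGG-16's feature stack
    model.features[idx].register_forward_hook(make_hook(attention_mask))

with torch.no_grad():
    logits = model(torch.rand(1, 3, 224, 224))
print(logits.shape)  # torch.Size([1, 1000])
```

Blending rather than hard zeroing means an imperfect mask only attenuates distractor features instead of erasing object evidence, which is consistent with the insensitivity to masking errors reported in the abstract.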
Pages: 6149-6154 (6 pages)
Related papers (50 records in total):
  • [1] Teaming Up Pre-Trained Deep Neural Networks
    Deabes, Wael
    Abdel-Hakim, Alaa E.
    [J]. 2018 INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING AND INFORMATION SECURITY (ICSPIS), 2018, : 73 - 76
  • [2] Detecting Deceptive Utterances Using Deep Pre-Trained Neural Networks
    Wawer, Aleksander
    Sarzynska-Wawer, Justyna
    [J]. APPLIED SCIENCES-BASEL, 2022, 12 (12)
  • [3] Medical Image Classification: A Comparison of Deep Pre-trained Neural Networks
    Alebiosu, David Olayemi
    Muhammad, Fermi Pasha
    [J]. 2019 17TH IEEE STUDENT CONFERENCE ON RESEARCH AND DEVELOPMENT (SCORED), 2019, : 306 - 310
  • [4] Alzheimer's disease classification using pre-trained deep networks
    Shanmugam, Jayanthi Venkatraman
    Duraisamy, Baskar
    Simon, Blessy Chittattukarakkaran
    Bhaskaran, Preethi
    [J]. BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2022, 71
  • [5] Semantic Segmentation of Mammograms Using Pre-Trained Deep Neural Networks
    Prates, Rodrigo Leite
    Gomez-Flores, Wilfrido
    Pereira, Wagner
    [J]. 2021 18TH INTERNATIONAL CONFERENCE ON ELECTRICAL ENGINEERING, COMPUTING SCIENCE AND AUTOMATIC CONTROL (CCE 2021), 2021
  • [6] Comparative Analysis of Pre-trained Deep Neural Networks for Plant Disease Classification
    George, Romiyal
    Thuseethan, Selvarajah
    Ragel, Roshan G.
    [J]. 2024 21ST INTERNATIONAL JOINT CONFERENCE ON COMPUTER SCIENCE AND SOFTWARE ENGINEERING, JCSSE 2024, 2024, : 179 - 186
  • [7] Recognizing Malaysia Traffic Signs with Pre-Trained Deep Convolutional Neural Networks
    How, Dickson Neoh Tze
    Sahari, Khairul Salleh Mohamed
    Hou, Yew Cheong
    Basubeit, Omar Gumaan Saleh
    [J]. 2019 4TH INTERNATIONAL CONFERENCE ON CONTROL, ROBOTICS AND CYBERNETICS (CRC 2019), 2019, : 109 - 113
  • [8] Integration of pre-trained protein language models into geometric deep learning networks
    Wu, Fang
    Wu, Lirong
    Radev, Dragomir
    Xu, Jinbo
    Li, Stan Z.
    [J]. COMMUNICATIONS BIOLOGY, 2023, 6 (01)
  • [9] Transfer Learning based Performance Comparison of the Pre-Trained Deep Neural Networks
    Kumar, Jayapalan Senthil
    Anuar, Syahid
    Hassan, Noor Hafizah
    [J]. INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2022, 13 (01) : 797 - 805