Attentional Masking for Pre-trained Deep Networks

Cited by: 0
Authors: Wallenberg, Marcus [1]; Forssen, Per-Erik [1]
Affiliation: [1] Linkoping Univ, Dept Elect Engn, Linkoping, Sweden
Funding: Swedish Research Council
Keywords: (none listed)
DOI: (not available)
CLC number: TP18 [Artificial Intelligence Theory]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract
The ability to direct visual attention is a fundamental skill for seeing robots. Attention comes in two flavours: the gaze direction (overt attention) and attention to a specific part of the current field of view (covert attention), of which the latter is the focus of the present study. Specifically, we study the effects of attentional masking within pre-trained deep neural networks for the purpose of handling ambiguous scenes containing multiple objects. We investigate several variants of attentional masking on partially pre-trained deep neural networks and evaluate the effects on classification performance and sensitivity to attention mask errors in multi-object scenes. We find that a combined scheme consisting of multi-level masking and blending provides the best trade-off between classification accuracy and insensitivity to masking errors. This proposed approach is denoted multilayer continuous-valued convolutional feature masking (MC-CFM). For reasonably accurate masks it can suppress the influence of distracting objects and reach comparable classification performance to unmasked recognition in cases without distractors.
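The abstract describes continuous-valued feature masking with blending applied at multiple network levels. As an illustration of the general idea (not the paper's exact MC-CFM implementation), the sketch below applies a soft attention mask to convolutional feature maps at two depths: attended regions keep their activations, while unattended regions are blended toward a background value instead of being zeroed, which is what makes the scheme tolerant of mask errors. All function names and parameters here are hypothetical.

```python
import numpy as np

def blend_mask(features, mask, alpha=0.5):
    """Continuous-valued masking with blending (illustrative sketch).
    features: (H, W, C) activation map; mask: (H, W) with values in [0, 1].
    Fully attended locations (mask == 1) keep their activations unchanged;
    unattended locations are blended toward the per-channel mean activation
    rather than hard-zeroed, softening the effect of mask errors.
    """
    m = mask[..., None]                               # broadcast over channels
    background = features.mean(axis=(0, 1), keepdims=True)
    weight = m + (1.0 - m) * alpha                    # soft, not binary, suppression
    return weight * features + (1.0 - weight) * background

def downsample_mask(mask, factor):
    """Average-pool the mask to match a deeper layer's spatial resolution."""
    h, w = mask.shape
    return mask.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# Multi-level application: the same object mask, resized for each layer.
rng = np.random.default_rng(0)
feat_shallow = rng.standard_normal((8, 8, 4))         # shallow-layer features
feat_deep = rng.standard_normal((4, 4, 8))            # deeper-layer features
mask = np.zeros((8, 8))
mask[2:6, 2:6] = 1.0                                  # attend to a central object
out_shallow = blend_mask(feat_shallow, mask)
out_deep = blend_mask(feat_deep, downsample_mask(mask, 2))
```

Because blending (`alpha > 0`) only attenuates rather than deletes unattended features, an imprecise mask degrades classification gracefully instead of discarding object evidence outright, matching the trade-off the abstract reports.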
Pages: 6149-6154 (6 pages)