MSA-Net: Establishing Reliable Correspondences by Multiscale Attention Network

Cited by: 20
Authors
Zheng, Linxin [1 ,2 ,3 ]
Xiao, Guobao [1 ]
Shi, Ziwei [1 ]
Wang, Shiping [2 ,3 ]
Ma, Jiayi [4 ]
Affiliations
[1] Minjiang Univ, Coll Comp & Control Engn, Fuzhou 350108, Peoples R China
[2] Fuzhou Univ, Coll Comp & Data Sci, Fuzhou 350108, Peoples R China
[3] Fuzhou Univ, Coll Software, Fuzhou 350108, Peoples R China
[4] Wuhan Univ, Elect Informat Sch, Wuhan 430072, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Task analysis; Feature extraction; Robustness; Pose estimation; Context modeling; Deep learning; Data mining; Outlier removal; deep learning; wide-baseline stereo;
DOI
10.1109/TIP.2022.3186535
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Code
081104; 0812; 0835; 1405;
Abstract
In this paper, we propose a novel multi-scale attention-based network (called MSA-Net) for the feature matching problem. Current deep-network-based feature matching methods suffer from limited effectiveness and robustness when applied to different scenarios, due to the random distribution of outliers and insufficient information learning. To address this issue, we propose a multi-scale attention block that enhances robustness to outliers and improves the representational ability of the feature map. In addition, we design a novel context channel refine block and a context spatial refine block to mine contextual information along the channel and spatial dimensions, respectively, with fewer parameters. The proposed MSA-Net is thus able to effectively infer the probability of each correspondence being an inlier with fewer parameters. Extensive experiments on outlier removal and relative pose estimation show that our network outperforms current state-of-the-art methods with fewer parameters on both outdoor and indoor datasets. Notably, on the relative pose estimation task, our network achieves an 11.7% improvement over the state-of-the-art method at an error threshold of 5 degrees without RANSAC when trained on the YFCC100M dataset.
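As a concrete illustration of how a correspondence pruning network of this kind can be wired, the sketch below shows a minimal multi-scale attention block followed by a per-correspondence inlier classifier in PyTorch. It follows only the high-level description in the abstract; the layer widths, kernel scales, block count, and the MultiScaleAttention / CorrespondenceClassifier names are illustrative assumptions, not the published MSA-Net architecture (which additionally uses context channel and spatial refine blocks that are not sketched here).

```python
# Minimal sketch of a multi-scale attention block for correspondence pruning,
# in the spirit of the description above. Layer widths, kernel scales, and
# block wiring are assumptions for illustration, not the authors' design.
import torch
import torch.nn as nn


class MultiScaleAttention(nn.Module):
    """Attend over per-correspondence features at several receptive scales."""

    def __init__(self, channels: int, scales=(1, 3, 5)):
        super().__init__()
        # One 1-D conv branch per scale along the correspondence axis.
        self.branches = nn.ModuleList(
            nn.Conv1d(channels, channels, k, padding=k // 2) for k in scales
        )
        # Channel attention that reweights the fused multi-scale features.
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool1d(1),
            nn.Conv1d(channels, channels // 4, 1),
            nn.ReLU(inplace=True),
            nn.Conv1d(channels // 4, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):                    # x: (B, C, N) features of N correspondences
        fused = sum(branch(x) for branch in self.branches) / len(self.branches)
        return x + fused * self.gate(fused)  # residual, attention-weighted


class CorrespondenceClassifier(nn.Module):
    """Map N putative matches (4-D point pairs) to per-match inlier logits."""

    def __init__(self, channels: int = 128, num_blocks: int = 4):
        super().__init__()
        self.embed = nn.Conv1d(4, channels, 1)   # (x1, y1, x2, y2) -> C channels
        self.blocks = nn.Sequential(
            *[MultiScaleAttention(channels) for _ in range(num_blocks)]
        )
        self.head = nn.Conv1d(channels, 1, 1)    # per-correspondence logit

    def forward(self, matches):                  # matches: (B, 4, N)
        return self.head(self.blocks(self.embed(matches))).squeeze(1)


if __name__ == "__main__":
    logits = CorrespondenceClassifier()(torch.randn(2, 4, 2000))
    print(logits.shape)                          # torch.Size([2, 2000])
```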
Pages: 4598 - 4608
Page count: 11
Related Papers
50 records in total
  • [21] ICA-Net: Industrial defect detection network based on convolutional attention guidance and aggregation of multiscale features
    Zhao, ShiLong
    Li, Gang
    Zhou, MingLe
    Li, Min
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2023, 126
  • [22] Tiny-Crack-Net: A multiscale feature fusion network with attention mechanisms for segmentation of tiny cracks
    Chu, Honghu
    Wang, Wei
    Deng, Lu
    COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, 2022, 37 (14) : 1914 - 1931
  • [23] MGU-Net: a multiscale gate attention encoder-decoder network for medical image segmentation
    Liu, Le
    Chen, Qi
    Su, Jian
    Du, Xiao Gang
    Lei, Tao
    Wan, Yong
    INTERNATIONAL JOURNAL OF COMPUTER APPLICATIONS IN TECHNOLOGY, 2023, 71 (04) : 275 - 285
  • [24] MSAL-Net: improve accurate segmentation of nuclei in histopathology images by multiscale attention learning network
    Ali, Haider
    ul Haq, Imran
    Cui, Lei
    Feng, Jun
    BMC MEDICAL INFORMATICS AND DECISION MAKING, 2022, 22 (01)
  • [25] DmADs-Net: dense multiscale attention and depth-supervised network for medical image segmentation
    Fu, Zhaojin
    Li, Jinjiang
    Chen, Zheng
    Ren, Lu
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2024 : 523 - 548
  • [26] MSAL-Net: improve accurate segmentation of nuclei in histopathology images by multiscale attention learning network
    Haider Ali
    Imran ul Haq
    Lei Cui
    Jun Feng
    BMC Medical Informatics and Decision Making, 22
  • [27] MSA-Net: A Few-Shot Object Detection Method Based on a Multi-Stage Attention Mechanism
    Tang, Yingwei
    Zhang, Rongfu
    Ding, Ran
    Zhang, Jie
    OPTICAL INSTRUMENTS (光学仪器), 2023, 45 (06) : 14 - 24
  • [28] Multiscale Attention U-Net for Skin Lesion Segmentation
    Alahmadi, Mohammad D.
    IEEE ACCESS, 2022, 10 : 59145 - 59154
  • [29] Video Dehazing Network Based on Multiscale Attention
    Han, Congwei
    Zhang, Kangkang
    Brekhna, Brekhna
    IEEE ACCESS, 2023, 11 : 94479 - 94485
  • [30] EAMR-Net: A multiscale effective spatial and cross-channel attention network for retinal vessel segmentation
    Prethija G.
    Katiravan J.
    MATHEMATICAL BIOSCIENCES AND ENGINEERING, 2024, 21 (03) : 4742 - 4761