Fast uniform content-based satellite image registration using the scale-invariant feature transform descriptor

Cited: 3
Authors
Bozorgi, Hamed [1 ]
Jafari, Ali [2 ]
Affiliations
[1] Univ Guilan, Dept Elect Engn, Rasht 416353756, Iran
[2] Malek Ashtar Univ Technol, Sch Elect & Elect Engn, Tehran 158751774, Iran
Keywords
Content-based image retrieval; Feature point distribution; Image registration; Linear discriminant analysis; Remote sensing; Scale-invariant feature transform; RECOGNITION; ALGORITHM;
DOI
10.1631/FITEE.1500295
Chinese Library Classification (CLC)
TP [Automation Technology; Computer Technology];
Discipline Code
0812 ;
Abstract
Content-based satellite image registration is a challenging problem in remote sensing and image processing. The difficulty is even greater when matching multisource remote sensing images, which suffer from illumination, rotation, and source differences. The scale-invariant feature transform (SIFT) algorithm has been applied successfully to satellite image registration, and many researchers have used the local SIFT descriptor to improve image retrieval. Despite its robustness, the algorithm has difficulties with the quality and quantity of the local feature points extracted from multisource remote sensing images. Furthermore, the high dimensionality of the local features extracted by SIFT leads to time-consuming computation and high storage requirements, both important factors in content-based image retrieval (CBIR) applications. In this paper, a novel method is introduced that transforms local SIFT features into global features for multisource remote sensing. The quality and quantity of SIFT local features are enhanced by applying contrast equalization to the images in a pre-processing stage. Treating the local features of each image in the reference database as a separate class, linear discriminant analysis (LDA) is used to transform the local features into global features while reducing the dimensionality of the feature space, which also significantly reduces the computational time and storage required. Applying the trained kernel to verification data and mapping them yielded a successful retrieval rate of 91.67% for test feature points.
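The core idea of the abstract (treat each reference image's SIFT descriptors as one class, then learn an LDA projection that maps the 128-D local descriptors into a compact, discriminative global feature space) can be sketched roughly as follows. This is a minimal illustration using scikit-learn, not the paper's implementation: random arrays stand in for the SIFT descriptors, and the class count, descriptor counts, and pooling-by-mean step are illustrative assumptions.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Stand-ins for SIFT output: in practice each array would come from
# cv2.SIFT_create().detectAndCompute() on a contrast-equalized image,
# giving one (n_keypoints, 128) descriptor matrix per reference image.
descriptors_per_image = [rng.normal(loc=i, size=(50, 128)) for i in range(5)]

# Each reference image is one class; its local descriptors are the samples.
X = np.vstack(descriptors_per_image)
y = np.repeat(np.arange(5), 50)

# LDA projects the 128-D local features onto at most (n_classes - 1)
# dimensions, cutting both storage and matching time in the CBIR stage.
lda = LinearDiscriminantAnalysis(n_components=4).fit(X, y)

# One compact global signature per image: here, the mean of its
# projected local descriptors (an assumed pooling choice).
global_features = np.array(
    [lda.transform(d).mean(axis=0) for d in descriptors_per_image]
)
print(global_features.shape)  # (5, 4)
```

A query image would be processed the same way (equalize, extract SIFT, project with the trained LDA, pool) and then compared against the stored global signatures instead of matching thousands of raw 128-D descriptors.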
Pages: 1108-1116
Page count: 9
Related Papers
50 items total
  • [31] Speckle-reducing scale-invariant feature transform match for synthetic aperture radar image registration
    Wang, Xianmin
    Li, Bo
    Xu, Qizhi
    JOURNAL OF APPLIED REMOTE SENSING, 2016, 10
  • [34] Robust Scale-Invariant Feature Matching for Remote Sensing Image Registration
    Li, Qiaoliang
    Wang, Guoyou
    Liu, Jianguo
    Chen, Shaobo
    IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2009, 6 (02) : 287 - 291
  • [35] Image Processing based Improved Face Recognition for Mobile Devices by using Scale-Invariant Feature Transform
    Karthikeyan, C.
    Jabber, Bhukya
    Deepak, V
    Vamsidhar, E.
    PROCEEDINGS OF THE 5TH INTERNATIONAL CONFERENCE ON INVENTIVE COMPUTATION TECHNOLOGIES (ICICT-2020), 2020, : 716 - 722
  • [36] Geometrically robust image watermarking using scale-invariant feature transform and Zernike moments
    Li, Leida
    Guo, Baolong
    Shao, Kai
    CHINESE OPTICS LETTERS, 2007, (06) : 332 - 335
  • [37] Image detection scale-invariant feature transform algorithm based on feature matching improves image matching accuracy
    Guo, Shuli
    Han, Lina
    Hao, Xiaoting
    JOURNAL OF THE AMERICAN COLLEGE OF CARDIOLOGY, 2017, 70 (16) : C10 - C10
  • [38] Geometrically robust image watermarking using scale-invariant feature transform and Zernike moments
    Li, Leida
    Guo, Baolong
    Shao, Kai
    CHINESE OPTICS LETTERS, 2007, 5 (06) : 332 - 335
  • [39] Synthetic Aperture Radar Image Matching Based on Improved Scale-Invariant Feature Transform
    Wang, Qing
    Tang, Tao
    Su, Yi
    2016 9TH INTERNATIONAL CONGRESS ON IMAGE AND SIGNAL PROCESSING, BIOMEDICAL ENGINEERING AND INFORMATICS (CISP-BMEI 2016), 2016, : 308 - 312
  • [40] Improved method for SAR image registration based on scale invariant feature transform
    Zhou, Deyun
    Zeng, Lina
    Liang, Junli
    Zhang, Kun
    IET RADAR SONAR AND NAVIGATION, 2017, 11 (04): : 579 - 585