Location estimation based on feature mode matching with deep network models

Cited: 1
Authors
Bai, Yu-Ting [1 ,2 ]
Jia, Wei [1 ]
Jin, Xue-Bo [1 ,2 ]
Su, Ting-Li [1 ]
Kong, Jian-Lei [1 ]
Affiliations
[1] Beijing Technol & Business Univ, Sch Artificial Intelligence, Beijing, Peoples R China
[2] Beijing Technol & Business Univ, Beijing Lab Intelligent Environm Protect, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
location estimation; feature extraction; mode classification; deep networks; location system; ALGORITHM;
DOI
10.3389/fnbot.2023.1181864
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Introduction: Global navigation satellite system (GNSS) signals can be lost in viaducts, urban canyons, and tunnels. Accurately locating pedestrians during Global Positioning System (GPS) signal outages has been a significant challenge. This paper proposes a location estimation method that uses only inertial measurements. Methods: The method is based on deep network models with feature mode matching. First, a framework is designed to extract features from inertial measurements and match them with deep networks. Second, feature extraction and classification methods are investigated to partition the measurements into modes, laying the foundation for evaluating different deep networks. Third, typical deep network models are analyzed to match the various feature modes. The selected models can be trained on the different modes of inertial measurements to obtain localization information. Experiments are performed on the inertial odometry dataset from Oxford University. Results and discussion: The results demonstrate that choosing networks appropriate to the different feature modes yields more accurate position estimates, which can improve pedestrian localization accuracy during GPS signal outages.
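The pipeline described in the abstract (extract features from an inertial window, classify its mode, then dispatch to a mode-specific estimator) can be sketched as follows. This is a minimal illustrative toy, not the paper's implementation: the feature set (magnitude mean and standard deviation), the two-mode partition, the threshold, and the peak-counting step estimator are all assumptions for demonstration, standing in for the paper's deep network models.

```python
import math

def extract_features(accel_window):
    """Summarize a window of (ax, ay, az) samples by the mean and
    standard deviation of the acceleration magnitude."""
    mags = [math.sqrt(ax * ax + ay * ay + az * az)
            for ax, ay, az in accel_window]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    return mean, math.sqrt(var)

def classify_mode(features, threshold=1.0):
    """Toy mode partition: low magnitude variance -> 'static',
    otherwise 'walking'. The threshold is illustrative."""
    _, std = features
    return "static" if std < threshold else "walking"

def estimate_step_count(accel_window, mode):
    """Mode-matched estimation: a static window contributes nothing;
    a walking window goes to a placeholder peak-counting estimator
    (where the paper would apply a trained deep network)."""
    if mode == "static":
        return 0
    mags = [math.sqrt(ax * ax + ay * ay + az * az)
            for ax, ay, az in accel_window]
    mean = sum(mags) / len(mags)
    # Count local maxima above the window mean as candidate steps.
    return sum(1 for a, b, c in zip(mags, mags[1:], mags[2:])
               if b > a and b > c and b > mean)

if __name__ == "__main__":
    static_window = [(0.0, 0.0, 9.8)] * 8
    walking_window = [(0.0, 0.0, 9.8 + 3.0 * math.sin(i))
                      for i in range(8)]
    for window in (static_window, walking_window):
        mode = classify_mode(extract_features(window))
        print(mode, estimate_step_count(window, mode))
```

The design point mirrored here is the dispatch: classification selects which estimator handles a window, so each estimator only needs to model one motion regime.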
Pages: 16
Related Papers
50 records total
  • [1] Soft Tissue Feature Tracking Based on Deep Matching Network
    Lu, Siyu
    Liu, Shan
    Hou, Pengfei
    Yang, Bo
    Liu, Mingzhe
    Yin, Lirong
    Zheng, Wenfeng
    CMES-COMPUTER MODELING IN ENGINEERING & SCIENCES, 2023, 136 (01): 363 - 379
  • [2] A Fault Feature Matching-Based Fault Location for Distribution Network with DGs
    Tian, Aina
    Li, Weixing
    Liu, Zhongjian
    Kang, Ruiying
    2016 IEEE INDUSTRIAL ELECTRONICS AND APPLICATIONS CONFERENCE (IEACON), 2016, : 167 - 172
  • [3] Feature Matching and Deep Learning Models for Attitude Estimation on a Micro-Aerial Vehicle
    Chumuang, Narumol
    Farooq, Adil
    Irfan, Muhammad
    Aziz, Sumair
    Qureshi, Moomal
    INTERNATIONAL CONFERENCE ON CYBERNETICS AND INNOVATIONS (ICCI 2022), 2022,
  • [4] Location recognition based on local feature matching
    Gao, Zhuoyue
    Chai, Lin
    Jin, Lizuo
    MIPPR 2019: AUTOMATIC TARGET RECOGNITION AND NAVIGATION, 2020, 11429
  • [5] Feature Relevance Network-Based Transfer Learning for Indoor Location Estimation
    Seok, Ho-Sik
    Hwang, Kyu-Baek
    Zhang, Byoung-Tak
    IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART C-APPLICATIONS AND REVIEWS, 2011, 41 (05): 711 - 719
  • [6] Graph matching based on feature and spatial location information
    Chuanju Liu
    Dongmei Niu
    Xinghai Yang
    Xiuyang Zhao
    The Visual Computer, 2023, 39 : 711 - 722
  • [7] Graph matching based on feature and spatial location information
    Liu, Chuanju
    Niu, Dongmei
    Yang, Xinghai
    Zhao, Xiuyang
    VISUAL COMPUTER, 2023, 39 (02): 711 - 722
  • [8] DeepMatcher: A deep transformer-based network for robust and accurate local feature matching
    Xie, Tao
    Dai, Kun
    Wang, Ke
    Li, Ruifeng
    Zhao, Lijun
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 237
  • [9] Underground target detection and location estimation based on scattering curve feature recognition network
    Zhang, Shuo
    Ren, Qiang
    Lei, Wentai
    Song, Qian
    Luo, Jiabin
    Luo, Shiguang
    Wang, Yiwei
    Xu, Long
    SEVENTH ASIA PACIFIC CONFERENCE ON OPTICS MANUFACTURE (APCOM 2021), 2022, 12166
  • [10] Feature-based Deep Learning of Proprioceptive Models for Robotic Force Estimation
    Berger, Erik
    Uhlig, Alexander
    PROCEEDINGS OF THE 2020 IEEE-RAS 20TH INTERNATIONAL CONFERENCE ON HUMANOID ROBOTS (HUMANOIDS 2020), 2021, : 128 - 134