Trajectory identification based on spatio-temporal proximity patterns between mobile phones

Cited by: 2
Authors:
Higuchi, Takamasa [1]
Yamaguchi, Hirozumi [1]
Higashino, Teruo [1]
Affiliations:
[1] Osaka Univ, Grad Sch Informat Sci & Technol, 1-5 Yamadaoka, Suita, Osaka 5650871, Japan
Keywords:
Indoor localization; Trajectory identification; Proximity sensing; Laser range scanners; Bluetooth; Localization; Tracking; Model
DOI: 10.1007/s11276-015-0987-z
CLC number: TP [Automation Technology; Computer Technology]
Subject classification code: 0812
Abstract:
The growing popularity of location-dependent mobile applications continues to stimulate demand for localization technology. However, despite significant research effort over the past decade, precise positioning in indoor environments remains an open problem. In this paper, we propose a novel type of indoor localization system that provides mobile phone users in a pedestrian crowd with their own position information at sub-meter accuracy by exploiting the powerful pedestrian-tracking capability of laser range scanners (i.e., laser-based distance measurement sensors). Although laser-based tracking can precisely detect the presence of pedestrians at each location in its sensing region, the resulting location information is not associated with any particular mobile phone user, so the system cannot by itself tell users where they are. To remove this limitation, we focus on spatio-temporal proximity patterns between mobile phones, which can be detected by peer-to-peer short-range wireless communication (e.g., Bluetooth). By checking the consistency between these communication logs and the proximity between anonymous trajectories detected by laser-based tracking, our system identifies the trajectory that corresponds to each mobile phone user and thereby provides users with their own position information. Through extensive simulations and field experiments, we show that our system achieves trajectory identification accuracy of up to 91%.
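As a concrete illustration of the identification principle described in the abstract, the following is a minimal sketch in Python, not taken from the paper, of consistency-based matching between phones and anonymous trajectories. All names, data structures, and parameters here (e.g., RADIO_RANGE_M, identify_trajectories, the toy track and phone IDs) are hypothetical assumptions for illustration; the actual system uses a more sophisticated formulation and would not enumerate assignments by brute force.

import itertools
import math

# Toy data model (hypothetical, not the paper's actual formulation):
#   trajectories:   anonymous track ID -> {timestamp: (x, y)} from laser tracking
#   proximity_logs: phone ID -> set of (timestamp, peer phone ID) pairs observed
#                   via short-range radio scans (e.g., Bluetooth)

RADIO_RANGE_M = 10.0  # assumed detection range of the short-range radio


def tracks_close(trajectories, track_a, track_b, t, max_dist=RADIO_RANGE_M):
    # Two anonymous tracks "agree" with a proximity log entry at time t if
    # both have a position at t and those positions are within radio range.
    pa = trajectories[track_a].get(t)
    pb = trajectories[track_b].get(t)
    return pa is not None and pb is not None and math.dist(pa, pb) <= max_dist


def consistency_score(assignment, trajectories, proximity_logs):
    # Count proximity-log entries consistent with a candidate phone-to-track
    # assignment: a log entry (t, peer) of phone p is consistent if the
    # tracks assigned to p and peer were actually close at time t.
    score = 0
    for phone, log in proximity_logs.items():
        for t, peer in log:
            if peer in assignment and tracks_close(
                    trajectories, assignment[phone], assignment[peer], t):
                score += 1
    return score


def identify_trajectories(trajectories, proximity_logs):
    # Exhaustive search over assignments; only viable at toy scale.
    phones = sorted(proximity_logs)
    best, best_score = None, -1
    for perm in itertools.permutations(trajectories, len(phones)):
        candidate = dict(zip(phones, perm))
        s = consistency_score(candidate, trajectories, proximity_logs)
        if s > best_score:
            best, best_score = candidate, s
    return best


# Usage: track1 and track2 move side by side, track3 is far away, and the
# phones' Bluetooth logs say phoneA and phoneB kept detecting each other.
trajectories = {
    "track1": {0: (0.0, 0.0), 1: (1.0, 0.0)},
    "track2": {0: (2.0, 0.0), 1: (3.0, 0.0)},
    "track3": {0: (50.0, 0.0), 1: (51.0, 0.0)},
}
proximity_logs = {
    "phoneA": {(0, "phoneB"), (1, "phoneB")},
    "phoneB": {(0, "phoneA")},
}
print(identify_trajectories(trajectories, proximity_logs))
# -> {'phoneA': 'track1', 'phoneB': 'track2'} (track3 is never chosen)

In this toy run the consistency score singles out assignments that pair the phones with the two co-moving tracks. Note that a single co-moving pair is inherently symmetric (swapping track1 and track2 scores the same), which suggests why longer logs with more distinct encounters are needed to disambiguate in practice.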
Pages: 563-577
Number of pages: 15
Related papers (showing 10 of 50):
  • [1] Trajectory identification based on spatio-temporal proximity patterns between mobile phones
    Higuchi, Takamasa
    Yamaguchi, Hirozumi
    Higashino, Teruo
    Wireless Networks, 2016, 22: 563-577
  • [2] Mining Spatio-Temporal Patterns in Trajectory Data
    Kang, Juyoung
    Yong, Hwan-Seung
    Journal of Information Processing Systems, 2010, 6(4): 521-536
  • [3] Mining Spatio-Temporal Semantic Trajectory for Groups Identification
    Cao, Yang
    Si, Yunfei
    Cai, Zhi
    Ding, Zhiming
    2018 IEEE 9th Annual Information Technology, Electronics and Mobile Communication Conference (IEMCON), 2018: 308-313
  • [4] Spatio-Temporal Variations in Hand Movement Trajectory based Brain Activation Patterns
    Robinson, Neethu
    Vinod, A. P.
    Guan, Cuntai
    2014 13th International Conference on Control Automation Robotics & Vision (ICARCV), 2014: 23-28
  • [5] Dynamic proximity of spatio-temporal sequences
    Horn, D
    Dror, G
    Quenet, B
    IEEE Transactions on Neural Networks, 2004, 15(5): 1002-1008
  • [6] Spatio-Temporal Trajectory Analysis of Mobile Objects Following the Same Itinerary
    Etienne, Laurent
    Devogele, Thomas
    Bouju, Alain
    Joint International Conference on Theory, Data Handling and Modelling in Geospatial Information Science, 2010, 38: 86-91
  • [7] Density based spatio-temporal trajectory clustering algorithm
    Cheng, Zhiyuan
    Jiang, Ling
    Liu, Desheng
    Zheng, Zezhong
    IGARSS 2018 - 2018 IEEE International Geoscience and Remote Sensing Symposium, 2018: 3358-3361
  • [8] Cauchyian Motion: A Spatio-Temporal Scale Invariant Mobile Trajectory Model
    Tsai, I-Fei
    2023 IEEE 98th Vehicular Technology Conference (VTC2023-Fall), 2023
  • [9] Spatio-Temporal GRU for Trajectory Classification
    Liu, Hong-Bin
    Wu, Hao
    Sun, Weiwei
    Lee, Ickjai
    2019 19th IEEE International Conference on Data Mining (ICDM 2019), 2019: 1228-1233
  • [10] Challenges of spatio-temporal trajectory datasets
    Arslan, Muhammad
    Cruz, Christophe
    Journal of Location Based Services, 2024: 302-333