Spatial-Temporal Recurrent Neural Network for Anomalous Trajectories Detection

Cited by: 7
Authors
Cheng, Yunyao [1 ,2 ]
Wu, Bin [1 ,2 ]
Song, Li [1 ,2 ]
Shi, Chuan [1 ,2 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, Beijing, Peoples R China
[2] Beijing Key Lab Intelligent Telecommun Software &, Beijing, Peoples R China
Source
ADVANCED DATA MINING AND APPLICATIONS, ADMA 2019 | 2019, Vol. 11888
Keywords
Anomaly detection; Recurrent Neural Network; Spatial-temporal sequence
DOI
10.1007/978-3-030-35231-8_41
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Aiming to improve the quality of taxi service and protect the interests of passengers, anomalous trajectory detection is attracting increasing attention. Most existing methods concentrate on the coordinate information of trajectories and learn similarities among anomalous trajectories from large numbers of coordinate sequences. These methods ignore spatial-temporal relationships and the particularity of the trajectory as a whole. Through data analysis, we find significant differences between normal and anomalous trajectories in their spatial-temporal characteristics. Meanwhile, a Recurrent Neural Network (RNN) can use trajectory embeddings to capture the sequential information in a trajectory. Consequently, we propose an efficient method named Spatial-Temporal Recurrent Neural Network (ST-RNN) that uses both the coordinate sequence and the spatial-temporal sequence. ST-RNN combines the strength of the RNN in learning sequential information with an attention mechanism that improves the model's performance. Applying spatial-temporal laws to anomalous trajectory detection also has a positive influence. Several experiments on a real-world dataset demonstrate that the proposed ST-RNN achieves state-of-the-art performance in most cases.
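The full paper is not reproduced in this record, so the exact ST-RNN architecture is unknown; the following is a minimal NumPy sketch of the general pattern the abstract describes (an RNN embedding a trajectory's feature sequence, attention pooling over the hidden states, and a sigmoid anomaly score). All dimensions, the feature layout, and the randomly initialised parameters are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy trajectory: T steps of (lon, lat, time-delta, step-distance) --
# coordinate plus spatial-temporal features (layout assumed for illustration).
T, d_in, d_h = 12, 4, 8
x = rng.normal(size=(T, d_in))

# Randomly initialised parameters stand in for a trained model.
W_x = rng.normal(scale=0.3, size=(d_in, d_h))
W_h = rng.normal(scale=0.3, size=(d_h, d_h))
v = rng.normal(scale=0.3, size=d_h)      # attention scoring vector
w_out = rng.normal(scale=0.3, size=d_h)  # anomaly-score head

# 1) Recurrence: embed the sequence step by step with a vanilla RNN cell.
h = np.zeros(d_h)
hidden_states = []
for t in range(T):
    h = np.tanh(x[t] @ W_x + h @ W_h)
    hidden_states.append(h)
hs = np.stack(hidden_states)             # shape (T, d_h)

# 2) Attention: softmax-weight each time step, pool into one trajectory vector.
scores = hs @ v
alpha = np.exp(scores - scores.max())
alpha /= alpha.sum()                     # attention weights sum to 1
context = alpha @ hs                     # shape (d_h,)

# 3) Score: sigmoid output in (0, 1), thresholded as anomalous or normal.
p_anom = 1.0 / (1.0 + np.exp(-(context @ w_out)))
print(p_anom)
```

In a real system the parameters would be trained end to end on labelled trajectories, and the attention weights indicate which segments of the trajectory drive the anomaly decision.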
Pages: 565-578
Page count: 14