OrientSTS-Spatio Temporal Sequence Searching for Trip Planning

Cited: 1
Authors
Zhou, Chunjie [1 ]
Dai, Pengfei [2 ]
Zhang, Zhenxing [1 ]
Affiliations
[1] Ludong Univ, Sch Informat & Elect Engn, Yantai, Peoples R China
[2] Yantai Cloudoer Software Co Ltd, Yantai, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Optimal; Personal Profiles; Personalized; Sequence; Scene Features; Social Networks; Spatio-Temporal; Trip Planning; SYSTEM;
DOI
10.4018/IJWSR.2018040102
Chinese Library Classification (CLC)
TP [Automation Technology; Computer Technology];
Subject Classification Code
0812;
Abstract
For satisfactory trip planning, the following features are desirable: 1) automated suggestion of scenes or attractions; 2) personalization based on the interests and habits of travelers; 3) maximal coverage of sites of interest; and 4) minimal effort, such as travel time along the route. Automated scene suggestion requires collecting extensive knowledge about scene sites and their characteristics, and personalized planning requires matching a traveler's profile against knowledge of scenes of interest. Since a trip consists of a sequence of stops at multiple scenes, trip planning becomes the problem of optimizing a temporal sequence in which each stop is weighted. This article presents OrientSTS, a novel spatio-temporal sequence (STS) searching system for optimal personalized trip planning. OrientSTS provides a knowledge base of scenes with their tagged features and seasonal characteristics. By combining personal profiles with scene features, OrientSTS generates a set of weighted scenes for each city and each user. OrientSTS can then retrieve the optimal sequence of scenes in terms of distance, weight, visiting time, and scene features. The authors develop alternative algorithms for searching optimal sequences that account for the weight of each scene, the preferences of users, and the travel-time constraint. Experiments on real datasets from social networks demonstrate the efficiency of the proposed algorithms.
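The core problem the abstract describes, selecting and ordering weighted scenes so that total interest is maximized within a travel-time budget, can be illustrated with a brute-force baseline. This is only a sketch of the problem formulation, not the paper's algorithms: `Scene`, `travel_time`, and `best_sequence` are hypothetical names, and straight-line distance over a constant speed stands in for real travel times.

```python
import itertools
import math
from dataclasses import dataclass

@dataclass
class Scene:
    name: str
    x: float           # location (arbitrary units)
    y: float
    weight: float      # interest score from matching profile and scene features
    visit_time: float  # hours spent at the scene

def travel_time(a: Scene, b: Scene, speed: float = 10.0) -> float:
    # Straight-line distance divided by an assumed constant speed.
    return math.hypot(a.x - b.x, a.y - b.y) / speed

def best_sequence(scenes, budget):
    """Exhaustively try every ordering of every subset of scenes and return
    the highest-weight sequence whose visiting plus travel time fits the budget."""
    best, best_w = [], 0.0
    for r in range(1, len(scenes) + 1):
        for seq in itertools.permutations(scenes, r):
            t = sum(s.visit_time for s in seq)
            t += sum(travel_time(seq[i], seq[i + 1]) for i in range(len(seq) - 1))
            w = sum(s.weight for s in seq)
            if t <= budget and w > best_w:
                best, best_w = list(seq), w
    return best, best_w
```

Because the search enumerates all permutations, it is exponential in the number of scenes; the paper's contribution lies in making this kind of weighted sequence search efficient, which the sketch above does not attempt.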
Pages: 21-46
Page count: 26
Related Papers
50 records in total
  • [1] OrientSTS: Spatio-Temporal Sequence Searching in Flickr
    Zhou, Chunjie
    Liu, Dongqi
    Meng, Xiaofeng
    PROCEEDINGS OF THE 34TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL (SIGIR'11), 2011, : 1265 - 1265
  • [2] Searching for significance in spatio-temporal firing patterns
    Gerstein, GL
    ACTA NEUROBIOLOGIAE EXPERIMENTALIS, 2004, 64 (02) : 203 - 207
  • [3] Techniques for Efficiently Searching in Spatial, Temporal, Spatio-temporal, and Multimedia Databases
    Kriegel, Hans-Peter
    Kroeger, Peer
    Renz, Matthias
    DATABASE SYSTEMS FOR ADVANCED APPLICATIONS, PROCEEDINGS, 2009, 5463 : 780 - 783
  • [4] Techniques for Efficiently Searching in Spatial, Temporal, Spatio-temporal, and Multimedia Databases
    Kriegel, Hans-Peter
    Kroeger, Peer
    Renz, Matthias
    26TH INTERNATIONAL CONFERENCE ON DATA ENGINEERING ICDE 2010, 2010, : 1218 - 1219
  • [5] Spatio-temporal patterns of antennal movements in the searching cockroach
    Okada, J
    Toh, Y
    JOURNAL OF EXPERIMENTAL BIOLOGY, 2004, 207 (21): : 3693 - 3706
  • [6] Searching for Spatio-Temporal-Keyword Patterns in Semantic Trajectories
    Gryllakis, Fragkiskos
    Pelekis, Nikos
    Doulkeridis, Christos
    Sideridis, Stylianos
    Theodoridis, Yannis
    ADVANCES IN INTELLIGENT DATA ANALYSIS XVI, IDA 2017, 2017, 10584 : 112 - 124
  • [7] A model for spatio-temporal network planning
    Nash, E
    James, P
    Parker, D
    COMPUTERS & GEOSCIENCES, 2005, 31 (02) : 135 - 143
  • [8] Spatio-Temporal Segmentation for Radiotherapy Planning
    Stawiaski, Jean
    Decenciere, Etienne
    Bidault, Francois
    PROGRESS IN INDUSTRIAL MATHEMATICS AT ECMI 2008, 2010, 15 : 223 - +
  • [9] Video sequence matching with spatio-temporal constraints
    Ren, W
    Singh, S
    PROCEEDINGS OF THE 17TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION, VOL 3, 2004, : 834 - 837
  • [10] Spatio-temporal consistency enhancement for disparity sequence
    Liu, Haixu
    Liu, Chenyu
    Tang, Yufang
    Sun, Haohui
    Li, Xueming
    International Journal of Signal Processing, Image Processing and Pattern Recognition, 2014, 7 (05) : 229 - 238