Applying Markov decision process to adaptive dynamic route selection model

Cited by: 1
Authors
Edrisi, Ali [1 ]
Bagherzadeh, Koosha [1 ]
Nadi, Ali [1 ]
Affiliations
[1] KN Toosi Univ Technol, Civil Engn Dept, Tehran, Iran
Keywords
traffic engineering; transport management; transport planning; NETWORK; OPPORTUNITIES; SYSTEMS
DOI
10.1680/jtran.19.00085
Chinese Library Classification (CLC)
TU [Building Science]
Discipline code
0813
Abstract
Routing technologies have long been available in many automobiles and smartphones, but the largely random nature of traffic on road networks continues to motivate efforts to improve the reliability of navigation systems. Given this uncertainty, an adaptive dynamic route selection model based on reinforcement learning is proposed. In the proposed method, a Markov decision process (MDP) is used to train simulated agents in a network so that they can make independent decisions under random conditions and thereby determine the set of routes with the shortest travel time. The aim of the research was to integrate the MDP with a multinomial logit model (a widely used stochastic discrete-choice model) to improve stochastic shortest-path finding by computing the probability of selecting an arc from among several interconnected arcs based on observations made at the arc location. The proposed model was tested with real data from part of the road network in Isfahan, Iran, and the results demonstrated its good performance under 100 randomly applied stochastic scenarios.
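The approach the abstract describes, value iteration over an MDP in which the next arc is chosen with multinomial-logit probabilities, can be sketched as follows. This is a minimal illustration only: the toy network, travel times and logit scale `theta` are assumptions for the example, not data or parameters from the paper.

```python
import math

# Hypothetical toy network (not the Isfahan network from the paper):
# arcs[node] -> list of (next_node, expected_travel_time)
arcs = {
    "A": [("B", 4.0), ("C", 2.0)],
    "B": [("D", 5.0), ("C", 1.0)],
    "C": [("D", 8.0), ("B", 1.5)],
    "D": [],  # destination, no outgoing arcs
}

def logit_probs(costs, theta=1.0):
    """Multinomial-logit arc choice: lower cost -> higher probability."""
    weights = [math.exp(-theta * c) for c in costs]
    total = sum(weights)
    return [w / total for w in weights]

def value_iteration(arcs, dest, tol=1e-6):
    """Expected travel time to dest when arcs are picked by the logit rule."""
    V = {n: 0.0 for n in arcs}
    while True:
        delta = 0.0
        for n, out in arcs.items():
            if n == dest or not out:
                continue
            # cost-to-go of each outgoing arc: arc time + value of next node
            costs = [t + V[m] for m, t in out]
            probs = logit_probs(costs)
            new_v = sum(p * c for p, c in zip(probs, costs))
            delta = max(delta, abs(new_v - V[n]))
            V[n] = new_v
        if delta < tol:
            return V

V = value_iteration(arcs, dest="D")
```

Because the logit rule keeps a positive probability on every outgoing arc, the resulting values are expected travel times under a stochastic (rather than deterministic shortest-path) policy, which is the role the multinomial logit model plays in the abstract's description.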
Pages: 359-372
Page count: 14
Related papers
50 records
  • [41] Application of Markov decision process as a mathematical model of operation and maintenance process
    Landowski, Bogdan
    SCIENTIFIC JOURNALS OF THE MARITIME UNIVERSITY OF SZCZECIN-ZESZYTY NAUKOWE AKADEMII MORSKIEJ W SZCZECINIE, 2010, 24 (96): 12 - 16
  • [42] A Markov decision process for response-adaptive randomization in clinical trials
    Merrell, David
    Chandereng, Thevaa
    Park, Yeonhee
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2023, 178
  • [43] Implementation and Evaluation of Adaptive Video Streaming based on Markov Decision Process
    Bokani, Ayub
    Hoseini, S. Amir
    Hassan, Mahbub
    Kanhere, Salil S.
    2016 IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC), 2016,
  • [44] Adaptive Maintenance Policies for Aging Devices Using a Markov Decision Process
    Abeygunawardane, Saranga K.
    Jirutitijaroen, Panida
    Xu, Huan
    IEEE TRANSACTIONS ON POWER SYSTEMS, 2013, 28 (03) : 3194 - 3203
  • [45] Computerized Adaptive Testing: A Unified Approach Under Markov Decision Process
    Gilavert, Patricia
    Freire, Valdinei
    COMPUTATIONAL SCIENCE AND ITS APPLICATIONS, ICCSA 2022, PT I, 2022, 13375 : 591 - 602
  • [46] A Markov Decision Model for Adaptive Scheduling of Stored Scalable Videos
    Chen, Chao
    Heath, Robert W., Jr.
    Bovik, Alan C.
    de Veciana, Gustavo
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2013, 23 (06) : 1081 - 1095
  • [47] Adaptive Cross-layer Optimization Based on Markov Decision Process
    Velickovic, Z.
    Jevtovic, M.
    ELEKTRONIKA IR ELEKTROTECHNIKA, 2011, 108 (02) : 39 - 42
  • [48] Robust Adaptive Markov Decision Processes: Planning with Model Uncertainty
    Bertuccelli, Luca F.
    Wu, Albert
    How, Jonathan P.
    IEEE CONTROL SYSTEMS MAGAZINE, 2012, 32 (05): : 96 - 109
  • [49] Evaluation of linearly solvable Markov decision process with dynamic model learning in a mobile robot navigation task
    Kinjo, Ken
    Uchibe, Eiji
    Doya, Kenji
    FRONTIERS IN NEUROROBOTICS, 2013, 7
  • [50] Dynamic Route Selection in Route Planners
    Hochmair, Hartwig H.
    KN - Journal of Cartography and Geographic Information, 2007, 57 (2) : 70 - 78