A Lane Detection Algorithm for Personal Vehicles

Cited by: 3
Authors
Kobayashi, Kazuyuki [1 ]
Watanabe, Kajiro [1 ]
Ohkubo, Tomoyuki [2 ,3 ]
Kurihara, Yosuke [2 ,3 ]
Affiliations
[1] Hosei Univ, Fac Engn, Tokyo, Japan
[2] Hosei Univ, Grad Sch Engn, Tokyo, Japan
[3] Hosei Univ, Doctoral Program, Tokyo, Japan
Keywords
lane detection; personal vehicle; omnidirectional camera; template matching;
DOI
10.1002/eej.21193
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology]
Discipline Code
0808; 0809
Abstract
By the term "personal vehicle" we mean a simple, lightweight vehicle expected to emerge as a personal ground transportation device. Motorcycles, electric wheelchairs, and motor-powered bicycles are examples of personal vehicles developed for individual transportation. Recently, a new type of intelligent personal vehicle, the Segway, has been developed; it is controlled and stabilized using multiple on-board intelligent sensors. Demand for such personal vehicles is increasing in order (1) to enhance human mobility, (2) to support mobility for elderly persons, and (3) to reduce environmental load. With the rapid growth of the personal vehicle market, the number of accidents caused by human error is also increasing. Because these accidents are linked to driving capability, intelligent assistance is needed both to enhance or support that capability and to prevent accidents. One of the most important elementary functions for personal vehicles is robust lane detection. In this paper, we develop a robust lane detection method for personal vehicles in outdoor environments. The proposed method uses a 360-degree omnidirectional camera and a robust image processing algorithm that detects lanes by combining template matching with the Hough transform. The validity of the proposed algorithm was confirmed with a prototype vehicle under various sunlight conditions. (C) 2011 Wiley Periodicals, Inc. Electr Eng Jpn, 177(4): 23-32, 2011; Published online in Wiley Online Library (wileyonlinelibrary.com). DOI 10.1002/eej.21193
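
The abstract describes the core of the method, template matching to find lane-marking candidates followed by a Hough transform to extract lane lines, only at a high level. The Python/OpenCV sketch below is an illustrative reconstruction of that combination, not the authors' implementation: the template image, the handling of the omnidirectional frame, and all thresholds are assumptions.

    # Illustrative sketch of "template matching + Hough transform" lane detection.
    # NOT the authors' code: the template, the treatment of the 360-degree
    # omnidirectional frame, and every threshold here are assumed values.
    import cv2
    import numpy as np

    def detect_lane_segments(frame_gray, lane_template, match_thresh=0.6):
        """Return straight line segments found in regions resembling a lane-marking template."""
        # 1) Template matching: the normalized correlation map peaks where the
        #    image patch looks like the lane-marking template.
        response = cv2.matchTemplate(frame_gray, lane_template, cv2.TM_CCOEFF_NORMED)

        # 2) Threshold the response into a binary mask of lane-mark candidates.
        mask = (response >= match_thresh).astype(np.uint8) * 255

        # 3) Probabilistic Hough transform on the candidate mask recovers the
        #    straight lane boundaries as (x1, y1, x2, y2) segments.
        lines = cv2.HoughLinesP(mask, rho=1, theta=np.pi / 180, threshold=30,
                                minLineLength=40, maxLineGap=10)
        return [] if lines is None else [tuple(seg[0]) for seg in lines]

    if __name__ == "__main__":
        # Hypothetical file names; any grayscale road image and a small lane-mark patch will do.
        frame = cv2.imread("omnidirectional_frame.png", cv2.IMREAD_GRAYSCALE)
        template = cv2.imread("lane_mark_template.png", cv2.IMREAD_GRAYSCALE)
        for x1, y1, x2, y2 in detect_lane_segments(frame, template):
            print(f"lane segment: ({x1}, {y1}) -> ({x2}, {y2})")

Note that segment coordinates are expressed in the coordinates of the correlation map; in practice they would be shifted by half the template size and, for an omnidirectional camera, mapped back through whatever panoramic unwrapping the system uses.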
Pages: 23-32
Page count: 10