Lane detection under adverse conditions based on dual color space

Nima Zarbakht, Ju Jia Zou

Research output: Chapter in Book / Conference Paper › Conference Paper › peer-review

6 Citations (Scopus)

Abstract

A high level of situational awareness is essential to an advanced driver assistance system. One of the most important tasks of such a system is to detect lane markings on the road and to distinguish them from the road surface and from other objects such as shadows and traffic. A robust lane detection algorithm is critical to a lane departure warning system: it must determine the relative lane position reliably and rapidly from captured images. The available literature offers methods that address adverse conditions such as precipitation, glare and blurred lane markings; however, the reliability of these methods can be adversely affected by lighting conditions. In this paper, a new method is proposed that combines two distinct color spaces in a pre-processing step to reduce interference. The method is adaptive to different lighting situations, and the directional gradient is used to detect the lane marking edges. The method can detect lane markings under the varied complexities imposed by shadows, rain, reflection, and strong light sources such as headlights and tail lights.
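The abstract does not name the two color spaces, thresholds or transform parameters used. As a rough illustration of the kind of pipeline it describes (dual color space pre-processing, a directional gradient for edge detection, and a Hough transform from the keywords below), the following Python/OpenCV sketch assumes HSV and LAB channels and percentile-based thresholds; all of these specifics are assumptions, not details from the paper.

```python
# Hypothetical sketch only: the abstract does not specify which two color
# spaces, thresholds or Hough parameters are used; HSV/LAB and the
# percentile-based thresholds below are illustrative assumptions.
import cv2
import numpy as np

def detect_lane_markings(bgr_frame):
    # Pre-processing with two color spaces: a brightness channel (HSV value)
    # for white markings and a blue-yellow channel (LAB b) for yellow ones.
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
    lab = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2LAB)
    value = hsv[:, :, 2]
    b_chan = lab[:, :, 2]

    # Per-image percentile thresholds stand in for the paper's adaptivity to
    # lighting: bright and dark scenes get different absolute cut-offs.
    white_mask = (value >= np.percentile(value, 90)).astype(np.uint8) * 255
    yellow_mask = (b_chan >= np.percentile(b_chan, 95)).astype(np.uint8) * 255
    combined = cv2.bitwise_or(white_mask, yellow_mask)

    # Directional gradient: a horizontal Sobel filter emphasises the mostly
    # vertical lane-marking edges seen by a forward-facing camera.
    grad_x = np.absolute(cv2.Sobel(combined, cv2.CV_64F, 1, 0, ksize=3))
    grad_x = np.uint8(255 * grad_x / (grad_x.max() + 1e-6))
    edges = (grad_x > 50).astype(np.uint8) * 255

    # Hough transform (listed in the paper's keywords) extracts straight
    # line segments from the edge map.
    return cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=40,
                           minLineLength=40, maxLineGap=20)

if __name__ == "__main__":
    frame = cv2.imread("road.jpg")  # any forward-facing road image
    if frame is not None:
        print(detect_lane_markings(frame))
```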
Original language: English
Title of host publication: Proceedings of 2018 International Conference on Digital Image Computing: Techniques and Applications (DICTA): Canberra, Australia, 10-13 December 2018
Publisher: IEEE
Pages: 179-183
Number of pages: 5
ISBN (Print): 9781538666029
DOIs
Publication status: Published - 2018
Event: DICTA (Conference)
Duration: 10 Dec 2018 → …

Conference

Conference: DICTA (Conference)
Period: 10/12/18 → …

Keywords

  • Hough transform
  • driver assistance systems
  • image processing
  • lane lines (roads)
  • road markings
  • traffic lanes
