Event-based color segmentation with a high dynamic range sensor

Alexandre Marcireau, Sio-Hoi Ieng, Camille Simon-Chane, Ryad B. Benosman

Research output: Contribution to journal › Article › peer-review

16 Citations (Scopus)

Abstract

This paper introduces a color asynchronous neuromorphic event-based camera and a methodology to process the color output of the device, performing color segmentation and tracking at the native temporal resolution of the sensor (down to one microsecond). Our color vision sensor prototype combines three Asynchronous Time-based Image Sensors and is sensitive to absolute color information. We devise a color processing algorithm leveraging this information. It is designed to be computationally cheap, showing how low-level processing benefits from asynchronous acquisition and high temporal resolution data. The resulting color segmentation and tracking performance is assessed on both an indoor controlled scene and two outdoor uncontrolled scenes. The mean tracking error relative to the ground truth for the objects of the outdoor scenes ranges from two to twenty pixels.
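Since this record does not reproduce the paper's method, the following is a minimal Python sketch of one plausible per-event color segmentation and tracking loop in the spirit of the abstract: the event format, the `EventColorTracker` class, and all thresholds are assumptions made for illustration, not the authors' published algorithm.

```python
# Hypothetical sketch of per-event color segmentation and tracking.
# The event format, class names, and thresholds below are assumptions
# for illustration; they are NOT the paper's algorithm.

from dataclasses import dataclass

@dataclass
class ColorEvent:
    t: int        # timestamp in microseconds
    x: int        # pixel column
    y: int        # pixel row
    rgb: tuple    # absolute color measurement from the three chromatic channels

def color_distance(rgb, target):
    """Squared Euclidean distance in RGB space."""
    return sum((a - b) ** 2 for a, b in zip(rgb, target))

class EventColorTracker:
    """Tracks the centroid of events whose color matches a target.

    Each incoming event is classified independently (no frames are
    accumulated), and matching events pull the tracked position toward
    themselves with an exponential moving average.
    """

    def __init__(self, target_rgb, threshold=2500.0, alpha=0.05):
        self.target = target_rgb
        self.threshold = threshold  # max squared color distance to accept
        self.alpha = alpha          # update rate of the moving average
        self.position = None        # (x, y) estimate, None until first match

    def update(self, event: ColorEvent):
        if color_distance(event.rgb, self.target) > self.threshold:
            return  # event does not belong to the tracked color segment
        if self.position is None:
            self.position = (float(event.x), float(event.y))
        else:
            px, py = self.position
            self.position = (
                (1 - self.alpha) * px + self.alpha * event.x,
                (1 - self.alpha) * py + self.alpha * event.y,
            )

# Usage: feed events one by one, at the sensor's native temporal resolution.
tracker = EventColorTracker(target_rgb=(200, 40, 40))
for ev in [ColorEvent(t=1, x=10, y=12, rgb=(205, 38, 45)),
           ColorEvent(t=2, x=11, y=12, rgb=(30, 30, 220))]:
    tracker.update(ev)
print(tracker.position)
```

The constant per-event cost of this scheme (one distance test and one averaging step) is one way to realize the "computationally cheap" processing the abstract attributes to asynchronous, high temporal resolution data.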
Original language: English
Article number: 135
Number of pages: 13
Journal: Frontiers in Neuroscience
Volume: 12
DOI: 10.3389/fnins.2018.00135
Publication status: Published - 2018

Open Access - Access Right Statement

Copyright © 2018 Marcireau, Ieng, Simon-Chane and Benosman. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) (https://creativecommons.org/licenses/by/4.0/). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

Keywords

  • color
  • signal processing
  • vision
