SpikeSEG: spiking segmentation via STDP saliency mapping

Paul Kirkland, Gaetano Di Caterina, John Soraghan, George Matich

Research output: Chapter in Book / Conference Paper › Conference Paper › peer-review

30 Citations (Scopus)

Abstract

Taking inspiration from the structure and behaviour of the human visual system, and using the Transposed Convolution and Saliency Mapping methods of Convolutional Neural Networks (CNN), a spiking event-based image segmentation algorithm, SpikeSEG, is proposed. The approach makes use of both spike-based imaging and spike-based processing, where the images are either standard images converted to spiking images or generated directly from a neuromorphic event-driven sensor, and are then processed using a spiking fully convolutional neural network. The spiking segmentation method uses the spike activations through time within the network to trace any outputs from saliency maps back to exact pixel locations. This not only gives exact pixel locations for spiking segmentation, but does so with low latency and low computational overhead. SpikeSEG is the first spiking event-based segmentation network, and over three experimental tests it achieves promising results, with 96% accuracy overall and a 74% mean intersection over union for the segmentation, all within an event-by-event framework.
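The traceback idea in the abstract — using spike activations to map a saliency output back to exact input pixels — can be sketched minimally. The function below is a hypothetical illustration, not the paper's implementation: a single-step integrate-and-fire convolution that records, for each output spike, which input pixels inside its receptive field contributed, so an output saliency spike can be traced back to exact input coordinates. All names and parameters here are assumptions for illustration.

```python
import numpy as np

def if_conv2d_trace(spike_frame, kernel, threshold=1.0):
    """Integrate-and-fire convolution over one spike frame, with traceback.

    spike_frame : 2-D binary array of input spikes, shape (H, W)
    kernel      : 2-D synaptic weight array, shape (k, k)
    threshold   : membrane potential needed for the neuron to fire

    Returns the output spike map, shape (H-k+1, W-k+1), and a dict
    mapping each firing output coordinate to the input pixel
    coordinates that contributed to that spike.
    """
    k = kernel.shape[0]
    H, W = spike_frame.shape
    out = np.zeros((H - k + 1, W - k + 1), dtype=int)
    trace = {}
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = spike_frame[i:i + k, j:j + k]
            membrane = float((patch * kernel).sum())   # integrate weighted input spikes
            if membrane >= threshold:                  # neuron fires
                out[i, j] = 1
                ys, xs = np.nonzero(patch)             # which inputs spiked
                trace[(i, j)] = [(i + y, j + x) for y, x in zip(ys, xs)]
    return out, trace

# Toy usage: a 3x3 spike frame, 2x2 kernel of weight 0.5.
frame = np.array([[1, 1, 0],
                  [1, 0, 0],
                  [0, 0, 0]])
out, trace = if_conv2d_trace(frame, np.full((2, 2), 0.5))
# Only the top-left neuron fires; tracing it back recovers the
# three input pixels that caused the spike.
```

In a multi-layer network this lookup would be applied layer by layer, from the saliency map back to the input, which is what gives pixel-exact segmentation without a separate decoder pass.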
Original language: English
Title of host publication: Conference Proceedings: 2020 International Joint Conference on Neural Networks (IJCNN), July 19 - July 24, 2020, virtual
Place of Publication: U.S.
Publisher: IEEE
Number of pages: 8
ISBN (Print): 9781728169262
Publication status: Published - Jul 2020
Externally published: Yes
Event: International Joint Conference on Neural Networks (virtual)
Duration: 19 Jul 2020 to 24 Jul 2020

Conference

Conference: International Joint Conference on Neural Networks
Abbreviated title: IJCNN
Period: 19/07/20 to 24/07/20

Bibliographical note

Publisher Copyright:
© 2020 IEEE.
