Inference in spiking Bayesian neurons using stochastic computation

Chetan Singh Thakur, Jamal Molin, Ralph Etienne-Cummings, Andre van Schaik

Research output: Chapter in Book / Conference Paper › Conference Paper › peer-review

2 Citations (Scopus)

Abstract

We present a stochastic Bayesian neuron (SBN) that codes for a binary hidden variable and whose temporal dynamics can be explained as Bayesian inference. We show that the SBN combines the maximum likelihood of its synaptic inputs with the prior probability of the hidden variable to infer the presence of that variable. Probabilistic models are computationally complex, which makes them difficult to implement on standard state-of-the-art digital hardware. Here, we employ stochastic logic elements to implement the SBN using minimal hardware resources. The SBN could serve as a basic element of a Bayesian processor that operates on probabilities instead of deterministic logic.
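As an illustrative aside (not taken from the paper), the Python sketch below shows the two ideas the abstract combines: inferring a binary hidden variable by multiplying a prior with the likelihood of observed input spikes, and approximating that product with stochastic computation, where a probability is encoded as the fraction of 1s in a Bernoulli bitstream and multiplication reduces to a bitwise AND. The prior, spike rates, and stream length are hypothetical placeholders, and this software sketch does not reproduce the paper's hardware implementation.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters (not from the paper): prior on the hidden variable
# and per-time-step spike probabilities of a synaptic input under each hypothesis.
prior_on = 0.2
rate_on, rate_off = 0.7, 0.2

def exact_posterior(spikes):
    # Bayes' rule: posterior is proportional to prior times spike likelihood.
    lik_on = np.prod(np.where(spikes, rate_on, 1.0 - rate_on))
    lik_off = np.prod(np.where(spikes, rate_off, 1.0 - rate_off))
    return prior_on * lik_on / (prior_on * lik_on + (1.0 - prior_on) * lik_off)

def bitstream(p, n=1 << 14):
    # Stochastic-computing encoding: probability p as a Bernoulli bitstream.
    return rng.random(n) < p

def stream_product(ps, n=1 << 14):
    # Multiplying probabilities is a bitwise AND of independent bitstreams.
    out = np.ones(n, dtype=bool)
    for p in ps:
        out &= bitstream(p, n)
    return out.mean()

spikes = rng.random(4) < rate_on   # simulated inputs while the hidden variable is "on"
num = stream_product([prior_on] + [rate_on if s else 1.0 - rate_on for s in spikes])
den = stream_product([1.0 - prior_on] + [rate_off if s else 1.0 - rate_off for s in spikes])

print("exact posterior     :", round(exact_posterior(spikes), 3))
print("stochastic estimate :", round(num / (num + den), 3))

The stochastic estimate converges to the exact posterior as the bitstream length grows, which is the trade-off (accuracy versus stream length and hardware cost) that motivates implementing the SBN with simple stochastic logic elements.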
Original language: English
Title of host publication: Proceedings of the 51st Annual Conference on Information Sciences and Systems (CISS), Baltimore, Maryland, 22-24 March 2017
Publisher: IEEE
Number of pages: 4
ISBN (Print): 9781509047802
DOIs
Publication status: Published - 2017
Event: Conference on Information Sciences and Systems
Duration: 22 Mar 2017 → …

Conference

Conference: Conference on Information Sciences and Systems
Period: 22/03/17 → …

Keywords

  • computational neuroscience
  • neurons
  • stochastic models
