Abstract
We present a stochastic Bayesian neuron (SBN) that codes for a binary hidden variable and whose temporal dynamics can be explained as Bayesian inference. We show that the SBN combines the likelihood of its synaptic inputs with the prior probability of the hidden variable to infer the presence of that variable. Probabilistic models are computationally demanding, which makes them difficult to realise on standard state-of-the-art digital hardware. Here, we employ stochastic logic elements to implement the SBN with minimal hardware resources. The SBN could serve as a basic building block for a Bayesian processor that operates on probabilities rather than deterministic logic.
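The abstract does not reproduce the paper's circuit, so the following is only a minimal Python sketch of the general idea it describes: the prior on the binary hidden variable and the per-synapse likelihoods (all numbers here are made up for illustration) are encoded as Bernoulli bitstreams and multiplied with bitwise AND gates, a standard stochastic-logic primitive, before a numerical normalisation yields the posterior. The function names and values below are assumptions, not taken from the paper.

```python
# Illustrative sketch only, not the paper's SBN circuit: Bayesian fusion of a
# prior and synaptic-input likelihoods using stochastic-computing bitstreams.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000  # bitstream length; longer streams reduce estimation variance


def to_bitstream(p, n=N):
    """Encode a probability p as a Bernoulli(p) bitstream."""
    return rng.random(n) < p


def stochastic_and(*streams):
    """Multiply probabilities of independent bitstreams with bitwise AND."""
    out = streams[0]
    for s in streams[1:]:
        out = out & s
    return out


def exact_posterior(prior, lik_h1, lik_h0):
    """Exact Bayesian posterior P(H=1 | inputs), for comparison."""
    num = prior * np.prod(lik_h1)
    den = num + (1.0 - prior) * np.prod(lik_h0)
    return num / den


# Hypothetical numbers: prior on the hidden variable and per-synapse likelihoods.
prior = 0.3
lik_h1 = [0.8, 0.7]   # P(input_i active | H = 1)
lik_h0 = [0.2, 0.4]   # P(input_i active | H = 0)

# Stochastic-logic estimates of the two un-normalised posterior terms:
# prior * prod(lik_h1) and (1 - prior) * prod(lik_h0).
num_stream = stochastic_and(to_bitstream(prior), *[to_bitstream(p) for p in lik_h1])
alt_stream = stochastic_and(to_bitstream(1 - prior), *[to_bitstream(p) for p in lik_h0])
num_est, alt_est = num_stream.mean(), alt_stream.mean()

print("stochastic estimate:", num_est / (num_est + alt_est))
print("exact posterior    :", exact_posterior(prior, lik_h1, lik_h0))
```

In real stochastic hardware the final normalisation would itself be realised with stochastic elements (e.g. a stochastic divider) rather than floating-point arithmetic; the sketch keeps that step numerical only for clarity.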
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the 51st Annual Conference on Information Sciences and Systems (CISS), Baltimore, Maryland, 22-24 March 2017 |
| Publisher | IEEE |
| Number of pages | 4 |
| ISBN (Print) | 9781509047802 |
| DOIs | |
| Publication status | Published - 2017 |
| Event | Conference on Information Sciences and Systems - Duration: 22 Mar 2017 → … |
Conference
| Conference | Conference on Information Sciences and Systems |
|---|---|
| Period | 22/03/17 → … |
Keywords
- computational neuroscience
- neurons
- stochastic models