
Multi-Sample Online Learning for Probabilistic Spiking Neural Networks

Uploaded: 2021-05-06

Document pages: 32

Abstract: Spiking Neural Networks (SNNs) capture some of the efficiency of biological brains for inference and learning via the dynamic, online, event-driven processing of binary time series. Most existing learning algorithms for SNNs are based on deterministic neuronal models, such as leaky integrate-and-fire, and rely on heuristic approximations of backpropagation through time that enforce constraints such as locality. In contrast, probabilistic SNN models can be trained directly via principled online, local, update rules that have proven to be particularly effective for resource-constrained systems. This paper investigates another advantage of probabilistic SNNs, namely their capacity to generate independent outputs when queried over the same input. It is shown that the multiple generated output samples can be used during inference to robustify decisions and to quantify uncertainty -- a feature that deterministic SNN models cannot provide. Furthermore, they can be leveraged for training in order to obtain more accurate statistical estimates of the log-loss training criterion, as well as of its gradient. Specifically, this paper introduces an online learning rule based on generalized expectation-maximization (GEM) that follows a three-factor form with global learning signals and is referred to as GEM-SNN. Experimental results on structured output memorization and classification on a standard neuromorphic data set demonstrate significant improvements in terms of log-likelihood, accuracy, and calibration when increasing the number of samples used for inference and training.
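The multi-sample inference idea from the abstract — querying a probabilistic network several times on the same input, then deciding from the empirical spike rates and reading off uncertainty — can be sketched as follows. This is a minimal illustrative toy, not the paper's GEM-SNN model: the Bernoulli output layer, the weight matrix, and the helper names are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def query_probabilistic_snn(x, weights):
    # Toy stand-in for one stochastic forward pass: each output
    # neuron fires with a sigmoid firing probability (Bernoulli),
    # so repeated queries on the same input give independent samples.
    logits = weights @ x
    p = 1.0 / (1.0 + np.exp(-logits))
    return (rng.random(p.shape) < p).astype(float)

def multi_sample_decision(x, weights, num_samples=20):
    # Repeated independent queries over the same input; the
    # empirical firing rate per output neuron is a Monte Carlo
    # estimate of its firing probability.
    samples = np.stack([query_probabilistic_snn(x, weights)
                        for _ in range(num_samples)])
    rates = samples.mean(axis=0)
    decision = int(np.argmax(rates))
    # Entropy of the normalized rates quantifies uncertainty:
    # near-uniform rates flag an unreliable decision, which a
    # single deterministic forward pass could not reveal.
    probs = rates / max(rates.sum(), 1e-12)
    entropy = float(-np.sum(probs * np.log(probs + 1e-12)))
    return decision, rates, entropy
```

Increasing `num_samples` tightens the rate estimates, which mirrors the abstract's observation that more samples improve both the robustness of decisions and the quality of the uncertainty estimate.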
