Parents:
 Wikipedia Spiking Neural Network
ANN models that more closely mimic natural neural networks. In addition to neuronal and synaptic state, SNNs also incorporate the concept of time. Neurons in an SNN do not fire at every propagation cycle (as happens in typical multi-layer perceptron networks), but rather fire only when a membrane potential – an intrinsic quality of the neuron related to its membrane electrical charge – reaches a specific value. When a neuron fires, it generates a signal that travels to other neurons
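The threshold-and-fire behavior described above can be sketched with a minimal leaky integrate-and-fire (LIF) neuron. This is an illustrative toy model, not code from any of the linked sources; the `threshold` and `leak` parameters are arbitrary assumptions:

```python
def simulate_lif(input_current, threshold=1.0, leak=0.9):
    """Leaky integrate-and-fire neuron: the membrane potential
    accumulates input and decays each step; the neuron emits a
    spike (1) only when the potential reaches the threshold,
    then resets to zero."""
    v = 0.0
    spikes = []
    for i in input_current:
        v = leak * v + i          # leaky integration of the input
        if v >= threshold:        # fire only at threshold crossing
            spikes.append(1)
            v = 0.0               # reset after the spike
        else:
            spikes.append(0)
    return spikes

# Under constant drive, the neuron fires periodically rather than
# at every step, unlike a standard perceptron unit.
print(simulate_lif([0.4] * 10))  # → [0, 0, 1, 0, 0, 1, 0, 0, 1, 0]
```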
Related Tags:
 
Descendants
  • Spiking Neural Networks, the Next Generation of Machine Learning (About)
    2019-01-29  
  • Training deep neural networks for binary communication with the Whetstone method | Nature Machine Intelligence (About)
    > Here, we describe a new approach to training SNNs, where the ANN training objective is not only to learn the task, but to produce an SNN in the process. Specifically, if the training procedure can include the eventual objective of low-precision communication between nodes, the training process of an SNN can be nearly as effective as that of a comparable ANN. This method, which we term Whetstone, inspired by the tool used to sharpen a dull knife, is intentionally agnostic to both the type of ANN being trained and the targeted neuromorphic hardware. Rather, the intent is to provide a straightforward interface for machine learning researchers to leverage the powerful capabilities of low-power neuromorphic hardware on a wide range of deep learning applications. Whetstone can train neural nets through Keras to be "spiking" without an expansion of the network or an expensive temporal code.
    2019-01-29  
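The core "sharpening" idea in the quoted abstract — training with a smooth activation that is gradually driven toward binary, spike-like communication — can be illustrated with a toy example. This is not the actual Whetstone API; it is a minimal sketch assuming a sigmoid whose steepness `k` is annealed during training:

```python
import math

def sharpened_sigmoid(x, k):
    """Bounded activation whose steepness k is increased ("sharpened")
    over the course of training; as k grows, the function approaches a
    step function, i.e. binary spike-like output."""
    return 1.0 / (1.0 + math.exp(-k * x))

# A small k gives a smooth, differentiable activation suitable for
# gradient-based training; a large k approximates low-precision
# (binary) communication between nodes.
for k in (1, 10, 100):
    print(round(sharpened_sigmoid(0.2, k), 4))
```

Annealing `k` slowly lets the network keep useful gradients early in training while converging toward the discrete communication that neuromorphic hardware requires.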