@prefix rdf:  <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix sl:   .
@prefix skos: <http://www.w3.org/2004/02/skos/core#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix tag:  .
@prefix foaf: <http://xmlns.com/foaf/0.1/> .
@prefix dc:   <http://purl.org/dc/elements/1.1/> .

tag:artificial_neural_network a sl:Tag ;
    skos:prefLabel "Neural networks" .

tag:spiking_neural_network a sl:Tag ;
    rdfs:isDefinedBy ;
    sl:comment "ANN models that more closely mimic natural neural networks. In addition to neuronal and synaptic state, SNNs also incorporate the concept of time. Neurons in an SNN do not fire at each propagation cycle (as happens with typical multi-layer perceptron networks), but rather fire only when a membrane potential – an intrinsic quality of the neuron related to its membrane electrical charge – reaches a specific value. When a neuron fires, it generates a signal which travels to other neurons." ;
    skos:broader tag:artificial_neural_network ;
    skos:prefLabel "Spiking Neural Network" ;
    foaf:page tag:spiking_neural_network.html .

tag:backpropagation a sl:Tag ;
    skos:prefLabel "Backpropagation" .

dc:title "Enabling Spike-Based Backpropagation for Training Deep Neural Network Architectures" ;
    sl:comment "> However, the ANN-SNN conversion scheme fails to capture the temporal dynamics of a spiking system" ;
    sl:creationDate "2021-04-19" ;
    sl:tag tag:spiking_neural_network , tag:discute_avec_raphael , tag:computational_neuroscience , tag:backpropagation .

dc:title "Spiking Neural Networks, the Next Generation of Machine Learning (2018)" ;
    sl:creationDate "2019-01-29" ;
    sl:tag tag:spiking_neural_network .

dc:title "Training deep neural networks for binary communication with the Whetstone method | Nature Machine Intelligence" ;
    sl:comment "> Here, we describe a new approach to training SNNs, where the ANN training is to not only learn the task, but to produce a SNN in the process. Specifically, if the training procedure can include the eventual objective of low-precision communication between nodes, the training process of a SNN can be nearly as effective as a comparable ANN. This method, which we term Whetstone inspired by the tool to sharpen a dull knife, is intentionally agnostic to both the type of ANN being trained and the targeted neuromorphic hardware. Rather, the intent is to provide a straightforward interface for machine learning researchers to leverage the powerful capabilities of low-power neuromorphic hardware on a wide range of deep learning applications\n\nWhetstone can train neural nets through Keras to be \"spiking\" without an expansion of the network or an expensive temporal code." ;
    sl:creationDate "2019-01-29" ;
    sl:tag tag:spiking_neural_network .

tag:python a sl:Tag ;
    skos:prefLabel "Python" .

tag:computational_neuroscience a sl:Tag ;
    skos:prefLabel "Computational Neuroscience" .

tag:discute_avec_raphael a sl:Tag ;
    skos:prefLabel "Discuté avec Raphaël" .

tag:github_project a sl:Tag ;
    skos:prefLabel "GitHub project" .

dc:title "jeshraghian/snntorch: Deep learning with spiking neural networks in Python" ;
    sl:comment "a Python package for performing gradient-based learning with spiking neural networks" ;
    sl:creationDate "2021-07-26" ;
    sl:tag tag:spiking_neural_network , tag:python , tag:github_project .
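
The sl:comment on tag:spiking_neural_network describes the core mechanism: a neuron keeps a membrane potential, integrates incoming signal over time, and fires only when that potential reaches a threshold. As an illustration only (not taken from any of the bookmarked sources), here is a minimal leaky integrate-and-fire sketch in plain Python; the decay factor, threshold, and input current are arbitrary values chosen so the neuron fires every few steps rather than at every propagation cycle.

import numpy as np

# Minimal leaky integrate-and-fire (LIF) neuron: the membrane potential
# decays each step, integrates the incoming current, and a spike is emitted
# only when the potential crosses the threshold (the potential then resets).
def lif_neuron(input_current, beta=0.9, threshold=1.0):
    mem = 0.0                      # membrane potential
    spikes = []
    for i_t in input_current:
        mem = beta * mem + i_t     # leak + integrate
        if mem >= threshold:       # fire only when the threshold is reached
            spikes.append(1)
            mem = 0.0              # reset after the spike
        else:
            spikes.append(0)
    return np.array(spikes)

# Constant drive: the neuron fires periodically rather than at every step,
# which is the "time matters" point made in the tag description.
print(lif_neuron(np.full(20, 0.3)))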
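
The snntorch bookmark describes a Python package for gradient-based learning with spiking neural networks. A rough usage sketch, assuming the Leaky neuron API shown in the snnTorch tutorials (a beta decay parameter, init_leaky() for the initial membrane state, and a (spike, membrane) pair returned per time step); the layer sizes, batch size, and number of time steps are arbitrary and worth checking against the installed snntorch version.

import torch
import torch.nn as nn
import snntorch as snn  # assumption: Leaky API as in the snnTorch tutorials

# One dense layer feeding a leaky integrate-and-fire neuron layer.
fc = nn.Linear(784, 10)
lif = snn.Leaky(beta=0.9)          # beta = membrane decay per time step

x = torch.rand(32, 784)            # arbitrary batch of inputs
mem = lif.init_leaky()             # initial membrane state
spk_rec = []

# Run the same input for several time steps; the neuron spikes only when
# its membrane potential crosses the (default) threshold, and gradients
# flow through a surrogate of the spike nonlinearity.
for _ in range(25):
    cur = fc(x)
    spk, mem = lif(cur, mem)
    spk_rec.append(spk)

out = torch.stack(spk_rec)         # [time, batch, neurons] spike trains
print(out.sum(dim=0).shape)        # spike count per neuron: torch.Size([32, 10])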